Showing posts with label Amazon web service.

Amazon ECS CLI Version 1.0.0
The Amazon EC2 Container Service (Amazon ECS) command line interface (CLI) is now available as version 1.0.0.
The Amazon ECS CLI is a developer tool built to make it easy to run applications with Amazon ECS. The CLI lets you create and manage Amazon ECS clusters and tasks with fewer commands from a terminal interface.
Previously, the Amazon ECS CLI only allowed you to configure one set of clusters and credentials for testing containers, and was available only for Linux and Mac environments.
Now, Amazon ECS CLI v1.0.0 lets you store multiple cluster and credential configurations and switch between them easily. You can specify additional ECS task definition parameters, and the CLI is available for Linux, Mac, and Windows environments.
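Conceptually, the new multi-configuration support works like named profiles: each configuration pairs a cluster with a region and one is marked as the default. The Python sketch below models that idea only; the helper names `add_profile`, `use_profile`, and `active_cluster` are illustrative, not part of the ECS CLI:

```python
# Illustrative model of named cluster/credential configurations, similar in
# spirit to the ECS CLI's named configs. A sketch, not the real implementation.

profiles = {}        # config name -> settings
default_name = None  # which config is currently active

def add_profile(name, cluster, region):
    """Store a named cluster configuration (hypothetical helper)."""
    global default_name
    profiles[name] = {"cluster": cluster, "region": region}
    if default_name is None:
        default_name = name   # the first profile becomes the default

def use_profile(name):
    """Switch the active configuration (hypothetical helper)."""
    global default_name
    if name not in profiles:
        raise KeyError(f"unknown configuration: {name}")
    default_name = name

def active_cluster():
    """Cluster that subsequent commands would target."""
    return profiles[default_name]["cluster"]

add_profile("dev", cluster="dev-cluster", region="us-east-1")
add_profile("prod", cluster="prod-cluster", region="eu-west-1")
use_profile("prod")
print(active_cluster())   # switching changes which cluster commands target
```

Switching the default is what previously required re-entering the single stored configuration by hand.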

To learn more, visit the Amazon ECS CLI documentation or the ECS CLI project on GitHub. For more information about Amazon ECS, visit the product page here.
Amazon ECS is available in the US East (Ohio), US East (N. Virginia), US West (N. California), US West (Oregon), Canada (Central), Asia Pacific (Seoul), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), EU (Frankfurt), EU (Ireland), EU (London), and China (Beijing) regions. For more information on AWS regions and services, please visit here.

Amazon Web Services - Amazon S3

Amazon S3 (Simple Storage Service) is a scalable, high-speed, low-cost web-based service designed for online backup and archiving of data and application programs. It allows you to upload, store, and download files of any type, up to 5 GB in a single upload. The service gives subscribers access to the same systems that Amazon uses to run its own websites. The subscriber has control over the accessibility of the data, i.e. whether it is privately or publicly accessible.



How to Configure S3?


Following are the steps to configure an S3 account.

Step 1 − Open the Amazon S3 console using this link − https://console.aws.amazon.com/s3/home

Step 2 − Create a Bucket using the following steps.

A prompt window will open. Click the Create Bucket button at the bottom of the page.


The Create a Bucket dialog box will open. Fill in the required details and click the Create button.


The bucket is created successfully in Amazon S3. The console displays the list of buckets and their properties.


Select the Static Website Hosting option. Click the Enable website hosting radio button and fill in the required details.
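The same bucket-plus-website setup can also be scripted. The sketch below builds the configuration document that S3's PutBucketWebsite API expects; `create_website_bucket` is defined but not invoked, because actually running it requires boto3, valid AWS credentials, and a globally unique bucket name (the bucket name shown is hypothetical):

```python
# Sketch: create a bucket and enable static website hosting programmatically.
# The AWS calls live in a function that is never invoked here.

def website_configuration(index_doc="index.html", error_doc="error.html"):
    """Build the configuration document used by S3 PutBucketWebsite."""
    return {
        "IndexDocument": {"Suffix": index_doc},
        "ErrorDocument": {"Key": error_doc},
    }

def create_website_bucket(bucket):
    """Console Step 2 plus website hosting, as API calls (needs credentials)."""
    import boto3
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket=bucket)           # the Create Bucket step
    s3.put_bucket_website(                    # the Static Website Hosting option
        Bucket=bucket,
        WebsiteConfiguration=website_configuration(),
    )

# create_website_bucket("my-example-bucket-name")  # hypothetical bucket name
print(website_configuration())
```

The index and error document names are the same values the console form asks for.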


Step 3 − Add an Object to a bucket using the following steps.

Open the Amazon S3 console using the following link − https://console.aws.amazon.com/s3/home

Click the Upload button.


Click the Add files option. Select the files to be uploaded from your system and then click the Open button.


Click the Start Upload button. The files will be uploaded into the bucket.

To open or download an object − In the Amazon S3 console, in the Objects & Folders list, right-click the object to be opened or downloaded, then select the required option.
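Uploads can also be done from code. One detail the console handles implicitly is the object's Content-Type; the sketch below guesses it from the file name with the standard mimetypes module, and `upload_with_type` (defined but not invoked, since it needs boto3 and credentials; the bucket and file names are hypothetical) shows where that value would be used:

```python
import mimetypes

def content_type_for(filename):
    """Guess a MIME type from the file name, falling back to a safe default."""
    guessed, _encoding = mimetypes.guess_type(filename)
    return guessed or "application/octet-stream"

def upload_with_type(bucket, filename):
    """Upload a local file with an explicit Content-Type (needs credentials)."""
    import boto3
    s3 = boto3.client("s3")
    s3.upload_file(filename, bucket, filename,
                   ExtraArgs={"ContentType": content_type_for(filename)})

# upload_with_type("my-example-bucket-name", "photo.jpg")  # hypothetical
print(content_type_for("photo.jpg"))   # image/jpeg
```

Setting the Content-Type matters for objects served from a static website bucket, since browsers rely on it.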




How to Move S3 Objects?


Following are the steps to move S3 objects.

Step 1 − Open the Amazon S3 console.

Step 2 − Select the Files & Folders option in the panel. Right-click the object to be moved and click the Cut option.


Step 3 − Open the location where the object is to be moved. Right-click the destination folder/bucket and click the Paste Into option.
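Behind the console's Cut/Paste, S3 has no native move operation: a move is a copy to the new key followed by a delete of the old key. The sketch below simulates that on an in-memory dict standing in for a bucket; a real version would call boto3's `copy_object` and then `delete_object`:

```python
# Sketch: an S3 "move" is copy-then-delete. A dict (key -> object body)
# stands in for the bucket; with boto3 the two steps would be
# copy_object(...) followed by delete_object(...).

def move_object(bucket, src_key, dst_key):
    """Move an object within a bucket by copying it and deleting the original."""
    if src_key not in bucket:
        raise KeyError(f"no such object: {src_key}")
    bucket[dst_key] = bucket[src_key]   # step 1: copy to the new key
    del bucket[src_key]                 # step 2: delete the old key

bucket = {"reports/2017.csv": b"data"}
move_object(bucket, "reports/2017.csv", "archive/2017.csv")
print(sorted(bucket))   # only the destination key remains
```

Because the copy happens first, a failure between the two steps leaves the object duplicated rather than lost.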


How to Delete an Object?


Step 1 − Open the Amazon S3 console.

Step 2 − Select the Files & Folders option in the panel. Right-click the object to be deleted and select the Delete option.

Step 3 − A pop-up window will open for confirmation. Click OK.




How to Empty a Bucket?


Step 1 − Open the Amazon S3 console.

Step 2 − Right-click the bucket to be emptied and click the Empty Bucket option.


Step 3 − A confirmation message will appear in the pop-up window. Read it carefully and click the Empty Bucket button to confirm.
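Emptying a bucket from code means listing every key and deleting them all; the DeleteObjects API accepts at most 1,000 keys per request, so the keys must be sent in batches. The batching helper below is pure Python; `empty_bucket` (defined but not invoked, since it needs boto3 and credentials) shows how it would be driven:

```python
# Sketch: empty a bucket by deleting its keys in batches.
# S3's DeleteObjects call accepts at most 1,000 keys per request.

def batches(keys, size=1000):
    """Split a list of keys into chunks no larger than `size`."""
    return [keys[i:i + size] for i in range(0, len(keys), size)]

def empty_bucket(bucket):
    """List all keys and delete them in batches (needs boto3 + credentials)."""
    import boto3
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    keys = [obj["Key"]
            for page in paginator.paginate(Bucket=bucket)
            for obj in page.get("Contents", [])]
    for chunk in batches(keys):
        s3.delete_objects(Bucket=bucket,
                          Delete={"Objects": [{"Key": k} for k in chunk]})

print(len(batches([str(i) for i in range(2500)])))   # 2500 keys -> 3 batches
```

The console's Empty Bucket button does this loop for you behind the scenes.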


Amazon S3 Features


Low cost and easy to use − Using Amazon S3, the user can store a large amount of data at very low cost.

Secure − Amazon S3 supports data transfer over SSL, and the data is encrypted automatically once it is uploaded. The user has complete control over the data by configuring bucket policies using AWS IAM.

Scalable − With Amazon S3, there is no need to worry about storage capacity. We can store as much data as we have and access it anytime.

Higher performance − Amazon S3 is integrated with Amazon CloudFront, which distributes content to end users with low latency and provides high data transfer speeds without any minimum usage commitments.

Integrated with AWS services − Amazon S3 is integrated with other AWS services, including Amazon CloudFront, Amazon CloudWatch, Amazon Kinesis, Amazon RDS, Amazon Route 53, Amazon VPC, AWS Lambda, Amazon EBS, Amazon DynamoDB, etc.

AWS - Simple Workflow Service

The following services fall under Application Services section −



Amazon CloudSearch
Amazon Simple Queue Services (SQS)
Amazon Simple Notification Services (SNS)
Amazon Simple Email Services (SES)
Amazon SWF
In this chapter, we will discuss Amazon SWF.

Amazon Simple Workflow Service (SWF) is a task-based API that makes it easy to coordinate work across distributed application components. It provides a programming model and infrastructure for coordinating distributed components and maintaining their execution state in a reliable way. Using Amazon SWF, we can focus on building the aspects of the application that differentiate it.

A workflow is a set of activities that carry out some objective, together with logic that coordinates the activities to achieve the desired output.

The workflow history is a complete and consistent record of every event that occurred since the workflow execution started. It is maintained by SWF.



How to Use SWF?


Step 1 − Sign in to your AWS account and select SWF on the Services dashboard.

Step 2 − Click the Launch Sample Walkthrough button.

Introducing Amazon EC2 P3 Instances


Amazon EC2 P3 instances are now available: the next generation of EC2 compute-optimized GPU instances. P3 instances are powered by up to 8 of the latest-generation NVIDIA Tesla V100 GPUs and are ideal for computationally advanced workloads such as machine learning (ML), high performance computing (HPC), data compression, and cryptography. They are also well suited to industry applications such as scientific computing and simulations, financial analytics, and image and video processing.

P3 instances provide a powerful platform for ML and HPC by also offering up to 64 vCPUs on custom Intel Xeon E5 processors, 488 GB of RAM, and up to 25 Gbps of aggregate network bandwidth using Elastic Network Adapter technology.

Based on NVIDIA's latest Volta architecture, each Tesla V100 GPU provides 125 TFLOPS of mixed-precision performance, 15.7 TFLOPS of single-precision (FP32) performance, and 7.8 TFLOPS of double-precision (FP64) performance. This is possible because each Tesla V100 GPU contains 5,120 CUDA cores and 640 Tensor cores. A 300 GB/s NVLink hyper-mesh interconnect allows high-speed, low-latency GPU-to-GPU communication.
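Those per-GPU figures scale with instance size. A quick back-of-the-envelope check of the aggregate peak compute on each P3 size, using only the numbers quoted above:

```python
# Aggregate peak TFLOPS per P3 instance size, from the per-GPU figures above.
PER_GPU_TFLOPS = {"mixed": 125.0, "fp32": 15.7, "fp64": 7.8}
GPUS_PER_SIZE = {"p3.2xlarge": 1, "p3.8xlarge": 4, "p3.16xlarge": 8}

def aggregate_tflops(size, precision):
    """Peak TFLOPS for a given instance size and numeric precision."""
    return GPUS_PER_SIZE[size] * PER_GPU_TFLOPS[precision]

for size in GPUS_PER_SIZE:
    print(size, aggregate_tflops(size, "mixed"), "TFLOPS mixed precision")
# p3.16xlarge reaches 8 x 125 = 1000 TFLOPS (1 PFLOPS) of mixed-precision compute
```

These are theoretical peaks; sustained throughput depends on the workload keeping the Tensor cores fed.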


For ML applications, P3 instances deliver up to a 14x performance improvement over P2 instances, allowing developers to train their machine learning models in hours instead of days and bring their innovations to market faster.

P3 instances are available in three sizes: p3.2xlarge with 1 GPU, p3.8xlarge with 4 GPUs, and p3.16xlarge with 8 GPUs. They are available in the US East (N. Virginia), US West (Oregon), EU (Ireland), and Asia Pacific (Tokyo) regions. Customers can purchase P3 instances as On-Demand Instances, Reserved Instances, Spot Instances, and Dedicated Hosts.
Also Read: Amazon Web Services mobile app

Amazon Aurora (MySQL) Doubles Maximum Write Throughput with Support for R4 Instances

Starting today, you can launch instances in the R4 family when using Amazon Aurora (MySQL). R4 is the next generation of memory-optimized instances and improves on the popular R3 instances with a larger L3 cache and faster memory.


An r4.16xlarge instance, the largest in the family, has 64 cores and 488 GiB of memory. Amazon Aurora (MySQL) can process up to 200,000 writes/second on an r4.16xlarge instance, which is double the previous maximum supported by Amazon Aurora (MySQL) on an r3.8xlarge instance.
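As a sanity check on the doubling claim, the sketch below compares the two peak write rates and what each implies for ingesting a fixed batch of single-row writes; the r3.8xlarge figure of 100,000 writes/second is the previous maximum implied by the announcement, and the timing is an idealized upper bound:

```python
# Peak Aurora (MySQL) write throughput, in writes/second.
# The r3.8xlarge figure is the previous maximum implied by the announcement.
PEAK_WRITES = {"r4.16xlarge": 200_000, "r3.8xlarge": 100_000}

def seconds_to_ingest(rows, instance):
    """Idealized time to ingest `rows` single-row writes at peak throughput."""
    return rows / PEAK_WRITES[instance]

speedup = PEAK_WRITES["r4.16xlarge"] / PEAK_WRITES["r3.8xlarge"]
print(f"speedup: {speedup:.0f}x")                   # the doubling in the title
print(seconds_to_ingest(1_000_000, "r4.16xlarge"))  # 5.0 seconds, idealized
```

Real workloads will land below these peaks; the point is the relative change between the two instance families.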


You can use R4 instances with Amazon Aurora (MySQL) version 1.15. R4 instances are available in all AWS regions where Amazon Aurora (MySQL) is available. For more information on pricing, visit the pricing page. To learn more about Amazon Aurora, a MySQL- and PostgreSQL-compatible relational database that combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open-source databases, please visit the Amazon Aurora product page.

Amazon Web Services - Direct Connect

AWS Direct Connect lets you establish a private network connection from your network to an AWS location. It uses 802.1Q VLANs, which can be partitioned into multiple virtual interfaces to access public resources over the same connection. This results in reduced network cost and increased bandwidth. Virtual interfaces can be reconfigured at any time as required.

Requirements to Use AWS Direct Connect

Your network must meet one of the following conditions to use AWS Direct Connect −

Your network is in an AWS Direct Connect location. Visit this link to see the available AWS Direct Connect locations − https://aws.amazon.com/directconnect/.

You work with an AWS Direct Connect partner who is a member of the AWS Partner Network (APN). Visit this link to see the list of AWS Direct Connect partners − https://aws.amazon.com/directconnect/

Your service provider is able to connect you to AWS Direct Connect.

In addition, your network must meet the following essential conditions −

Connections to AWS Direct Connect require single-mode fiber: 1000BASE-LX (1310 nm) for 1 Gigabit Ethernet or 10GBASE-LR (1310 nm) for 10 Gigabit Ethernet. Auto-negotiation for the port must be disabled. Support for 802.1Q VLANs across these connections must be available.

Your network must support Border Gateway Protocol (BGP) and BGP MD5 authentication. Optionally, you may configure Bidirectional Forwarding Detection (BFD).
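The parameters in these requirements (an 802.1Q VLAN tag, a BGP ASN, an MD5 authentication key) are exactly what you supply when creating a virtual interface. The sketch below represents such a configuration as a plain dict and checks the obvious constraints; the field names loosely follow the Direct Connect CreateVirtualInterface API, but the validation logic is illustrative, not AWS's actual rules:

```python
# Sketch: represent and sanity-check a Direct Connect virtual interface config.
# Field names loosely follow the CreateVirtualInterface API; the checks are
# illustrative, not AWS's real validation.

def validate_virtual_interface(cfg):
    """Return a list of problems with a virtual-interface configuration."""
    problems = []
    if not 1 <= cfg.get("vlan", 0) <= 4094:        # valid 802.1Q VLAN tag range
        problems.append("vlan must be between 1 and 4094")
    if not 1 <= cfg.get("asn", 0) <= 4294967295:   # BGP ASN (up to 32-bit)
        problems.append("asn must be a valid BGP ASN")
    if not cfg.get("authKey"):                     # BGP MD5 authentication key
        problems.append("authKey (BGP MD5) is required")
    return problems

vif = {"virtualInterfaceName": "my-vif", "vlan": 101,
       "asn": 65000, "authKey": "s3cr3t"}          # hypothetical values
print(validate_virtual_interface(vif))   # an empty list means the config passes
```

Catching these errors before submitting the console form saves a round trip, since the same values are rejected server-side otherwise.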

Also Read: Amazon Virtual Private Cloud

How to Configure AWS Direct Connect? 


Following are the steps to configure AWS Direct Connect −

Step 1 − Open the AWS Direct Connect console using this link − https://console.aws.amazon.com/directconnect/

Step 2 − Select the AWS Direct Connect region from the navigation bar.

Step 3 − The welcome page of AWS Direct Connect opens. Select Get Started with Direct Connect.

Step 4 − The Create a Connection dialog box opens. Fill in the required details and click the Create button.

AWS will send a confirmation email to the authorized user within 72 hours.

Step 5 − Create a Virtual Interface using the following steps.

Open the AWS console page again.

Select Connection in the navigation bar, then select Create Virtual Interface. Fill in the required details and click the Continue button.

Verify the Virtual Interface (optional). To verify the AWS Direct Connect connections, use the following procedure.

To verify the virtual interface connection to the AWS cloud − Run traceroute and verify that the AWS Direct Connect identifier is present in the network trace.

To verify the virtual interface connection to Amazon VPC − Use any pingable AMI and launch an Amazon EC2 instance into the VPC that is attached to the virtual private gateway.

Once the instance is running, get its private IP address and ping it to get a response.

Also Read: AWS Route 53

Features of Direct Connect 



Reduces bandwidth costs − Costs are reduced in both directions, since data is transferred to and from AWS directly. Data transferred over your dedicated connection is charged at the reduced AWS Direct Connect data transfer rate rather than Internet data transfer rates.

Compatible with all AWS services − AWS Direct Connect is a network service and supports all AWS services that are accessible over the Internet, such as Amazon S3, Amazon EC2, Amazon VPC, and so on.

Private connectivity to Amazon VPC − AWS Direct Connect can be used to establish a private virtual interface from your on-premises network directly to Amazon VPC, with high bandwidth.

Elastic − AWS Direct Connect provides 1 Gbps and 10 Gbps connections, with the option to create multiple connections as required.

Simple − It is easy to sign up for AWS Direct Connect using the AWS Management Console, and all connections and virtual interfaces can be managed from that console.
