AWS re:Invent - New services and features

Amazon Web Services announced many new services and features at the recently held AWS re:Invent 2016. In this blog post I am writing about the new announcements related to storage and data migration.

AWS Snowmobile:

AWS Snowmobile is nothing but a beast that carries petabytes of your data to the AWS cloud on a truck. Yes, on a truck, I am not kidding 🙂

AWS Snowmobile

This secure data truck stores up to 100 PB of data and can help you move exabytes to AWS in a matter of weeks. Snowmobile attaches to your data center network and appears as a local NFS-mounted volume. It includes a network cable connected to a high-speed switch capable of supporting 1 Tb/sec of data transfer spread across multiple 40 Gb/sec connections. As per the AWS official blog, Snowmobile is available in all AWS Regions, and you need to contact the AWS Sales team to use this service.

AWS Snowball Edge:

The new Snowball Edge appliance has all the features of its twin brother Snowball, which was launched in 2015.

AWS Snowball Edge

AWS Snowball Edge is a petabyte-scale data transport appliance with on-board storage and compute. It arrives with your S3 buckets, Lambda code, and clustering configuration pre-installed. You can execute AWS Lambda functions and process data locally on the Snowball Edge, as sketched below. You can order Snowball Edge with just a few clicks in the AWS Management Console.
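To make the local-processing idea concrete, here is a minimal sketch of an S3-event-style Lambda handler in Python. The event shape mirrors a standard S3 PUT notification; the processing step is purely illustrative and not anything Snowball-specific.

```python
def handler(event, context):
    """Minimal S3-event-style Lambda handler, as could run locally on Snowball Edge.

    The event layout follows a standard S3 notification; real code would
    fetch each object from the local S3 endpoint and transform it.
    """
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Illustrative only: just report which object arrived.
        results.append({"bucket": bucket, "key": key})
    return {"processed": results}
```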

Amazon EFS – New Feature

Amazon Elastic File System (Amazon EFS) provides simple file storage for use with Amazon EC2 instances in the AWS Cloud. With this new feature, we can mount Amazon EFS file systems on our on-premises data center servers when they are connected to an Amazon VPC over AWS Direct Connect.

S3 Storage Management with new features

S3 Object Tagging:

S3 object tags are key-value pairs applied to S3 objects, and they can be created, updated or deleted at any time during the lifetime of the object. With these tags, users have the ability to create IAM policies, set up lifecycle policies, and customize storage metrics, as in the small sketch below.
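A minimal boto3 sketch of tagging an object and reading the tags back; the bucket name, key, and tag values are made up for illustration.

```python
import boto3

s3 = boto3.client("s3")

# Apply (or overwrite) tags on an existing object; bucket, key and tag
# values here are hypothetical.
s3.put_object_tagging(
    Bucket="my-example-bucket",
    Key="reports/2016/summary.csv",
    Tagging={
        "TagSet": [
            {"Key": "project", "Value": "reinvent-demo"},
            {"Key": "classification", "Value": "internal"},
        ]
    },
)

# Read the tags back.
tags = s3.get_object_tagging(Bucket="my-example-bucket",
                             Key="reports/2016/summary.csv")
print(tags["TagSet"])
```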

S3 Analytics, Storage Class Analysis:

This new S3 Analytics feature automatically identifies the optimal lifecycle policy for moving less frequently accessed storage to S3 Standard – Infrequent Access. Users can configure a storage class analysis policy to monitor an entire bucket, a prefix, or an object tag. Once an infrequent access pattern is observed, we can easily create a new lifecycle policy based on the results; a configuration sketch follows below.
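Here is a hedged boto3 sketch that turns on storage class analysis for a single prefix and exports the daily results; the bucket names, configuration Id and export prefix are all hypothetical.

```python
import boto3

s3 = boto3.client("s3")

# Enable storage class analysis for one prefix and export results as CSV;
# the buckets, Id and prefixes below are hypothetical.
s3.put_bucket_analytics_configuration(
    Bucket="my-example-bucket",
    Id="logs-prefix-analysis",
    AnalyticsConfiguration={
        "Id": "logs-prefix-analysis",
        "Filter": {"Prefix": "logs/"},
        "StorageClassAnalysis": {
            "DataExport": {
                "OutputSchemaVersion": "V_1",
                "Destination": {
                    "S3BucketDestination": {
                        "Format": "CSV",
                        "Bucket": "arn:aws:s3:::my-analytics-results",
                        "Prefix": "storage-class-analysis/",
                    }
                },
            }
        },
    },
)
```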

S3 CloudWatch Metrics:

This feature helps you understand and improve the performance of applications that use Amazon S3 by monitoring and alarming on 13 new S3 CloudWatch metrics. Users can receive 1-minute CloudWatch metrics, set alarms, and access dashboards to view real-time operations and performance, such as bytes downloaded from their Amazon S3 storage.
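As a rough sketch of how this could be wired up with boto3, the snippet below enables request metrics on a bucket and then sets a CloudWatch alarm on one of the new metrics (4xxErrors); the bucket name, filter Id and thresholds are hypothetical.

```python
import boto3

s3 = boto3.client("s3")
cloudwatch = boto3.client("cloudwatch")

# Request metrics must first be enabled on the bucket (whole bucket here).
s3.put_bucket_metrics_configuration(
    Bucket="my-example-bucket",
    Id="EntireBucket",
    MetricsConfiguration={"Id": "EntireBucket"},
)

# Alarm when the bucket serves too many 4xx errors per 1-minute period.
cloudwatch.put_metric_alarm(
    AlarmName="my-example-bucket-4xx-errors",
    Namespace="AWS/S3",
    MetricName="4xxErrors",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-example-bucket"},
        {"Name": "FilterId", "Value": "EntireBucket"},
    ],
    Statistic="Sum",
    Period=60,
    EvaluationPeriods=5,
    Threshold=100,
    ComparisonOperator="GreaterThanThreshold",
)
```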

You can read my blog post on data migration to AWS by clicking the link below. Thanks and happy reading 🙂

Cloud data migration with AWS

 


Cloud data migration with AWS

Data migration is a key challenge in any cloud migration, and as a storage admin it always fascinated me to understand the effort it takes to migrate petabytes of data to the public cloud. In this post I will try to give a brief outline of 3 out of the 8 ways in which we can migrate data to Amazon Web Services.

AWS Direct Connect: With AWS Direct Connect you get a dedicated network connection from your data center premises to AWS. With the high available speeds you can either copy data directly from any of your servers to an S3 bucket using CLI or SDK commands (a sketch follows below) or do a host-based migration to an EC2 instance with a sufficient number of EBS volumes. Multiple connections can be used simultaneously for increased bandwidth or redundancy. We can also use the AWS Partner Network in case an AWS Direct Connect location is not available near your data centers.
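As a sketch of the "copy straight to S3" option, here is a boto3 multipart upload tuned for a high-bandwidth link; the file path, bucket name and transfer settings are hypothetical.

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Multipart upload settings sized for a fat Direct Connect pipe; the
# thresholds, bucket and file names here are hypothetical.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=10,                      # parallel part uploads
)

s3.upload_file(
    Filename="/data/exports/archive-2016-12.tar",
    Bucket="my-migration-bucket",
    Key="exports/archive-2016-12.tar",
    Config=config,
)
```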

AWS Direct Connect

 

AWS Import/Export Snowball: AWS Import/Export Snowball is a petabyte-scale storage migration solution. AWS will ship a storage device, as shown below, to your data center, which can hold 50 or 80 TB of data.

Snowball

Once you receive a Snowball, you plug it in, connect it to your network, configure the IP address, and install the AWS Snowball client. Use the client to identify the directories you want to copy. Data is encrypted while copying to the Snowball and decrypted when AWS offloads it to S3. As per AWS, it takes about 21 hours to copy 80 TB of data from your data source to a Snowball over a 10 Gbps connection at 80 percent network utilization (a quick back-of-the-envelope check follows below). AWS has also shown a use case where a customer was able to migrate 1 PB of data in 1 week using multiple Snowballs in parallel.
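A quick back-of-the-envelope check of that 21-hour figure, assuming decimal terabytes and ignoring per-file and protocol overhead:

```python
# Rough sanity check of the transfer time quoted above.
data_bytes = 80 * 10**12                 # 80 TB (decimal)
link_bps = 10 * 10**9                    # 10 Gbps link
effective_bps = link_bps * 0.80          # 80% utilization

seconds = data_bytes * 8 / effective_bps
print(f"{seconds / 3600:.1f} hours")     # ~22 hours, in line with the ~21h figure
```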

AWS Storage Gateway: Storage Gateway is installed on a local host in your data center. It creates an on-premises virtual appliance that provides seamless and secure integration between your on-premises applications and AWS's storage infrastructure. It can expose iSCSI volumes that keep all or only recently accessed data on premises for faster response while asynchronously uploading the data to Amazon S3 in the background; a small provisioning sketch follows after the figure below.

AWS Storage Gateway
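For the volume-creation step, here is a hedged boto3 sketch that carves a cached iSCSI volume out of an already activated gateway; the gateway ARN, target name, size and network interface are all hypothetical.

```python
import uuid

import boto3

sgw = boto3.client("storagegateway")

# Create a cached iSCSI volume on an existing, activated gateway; values
# below are placeholders for illustration only.
response = sgw.create_cached_iscsi_volume(
    GatewayARN="arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-EXAMPLE",
    VolumeSizeInBytes=500 * 1024**3,        # 500 GiB volume backed by S3
    TargetName="migration-volume-01",        # becomes part of the iSCSI target IQN
    NetworkInterfaceId="10.0.1.25",          # gateway interface that serves iSCSI
    ClientToken=str(uuid.uuid4()),           # idempotency token
)

print(response["TargetARN"])                 # attach this target from your servers
```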

A combination of AWS Snowball with either Direct Connect or Storage Gateway will help make migration much faster and easier. We can do a one-time migration of data using Snowball and later make differential data updates using Direct Connect or Storage Gateway. Hope this has given some basic idea of migrations with AWS solutions. Thanks for reading.

You can also read my blog post on NVMe, a next-gen SSD interface, by clicking the link below. Happy reading. 🙂

https://sskanth.com/2016/04/20/nvme-nextgen-sd-interface/

 

 

 

An AWSomeday

The AWS event got a lot of craze among IT folks in Hyderabad. Almost 400-odd IT engineers turned up to this event on Nov 17, 2015 at ITC Kakatiya, Begumpet, Hyderabad. I know that many folks registered for the event, but delegates are invited based on a selection process by AWS. This post is to give a glimpse of how exactly the event happened and the takeaways from it.

The event started with a welcome note from Chandra Balani, Head of Business Development. He gave us a quick view of AWS history. Chandra said that AWS has been available to the public since 2006, but prior to that they used this technology to run the Amazon.com site for almost a decade. So AWS has a total experience of nearly 20 years in the cloud industry.


The tech part started with Harshith taking charge. He is a great guy; I remember him from last year's AWSome Day event. He carries the entire event on his shoulders with all the technical stuff. He gave a deeper understanding of AWS core and application services and showed how to deploy and automate your infrastructure on the AWS Cloud. We were given a student guide with information on AWS storage, compute, network and applications.

 

The event had its fun part too. There were lots of contests on Twitter with the hashtag #AWSomday, and there were stalls by AWS partners and experts. The AWS experts were really nice in answering most of the questions from the delegates. The event ended with goodies presented to the lucky winners and participation certificates for the delegates.