





Hello, I have attached the file containing the 19-page document that was completed in my previous Order #114452, and also the feedback from my professor.




If you all could kindly include screenshots taken from the AWS Academy console lab environment in the report, that would allow me to pass. I raised a previous ticket, but the screenshots added were not taken from the AWS Academy work that was supposed to have been done there.





My assignment from the order before this one (Order #113949) is a perfect example of how the report should be completed.






Please view the last page of the included Instructions document for login information.





Thank you,



Executive Summary

Cloud computing architecture provides a structured way of describing the components and subcomponents that deliver advantages in storage, sharing, maintenance and flexibility. Cloud computing refers to services such as databases, software and storage, and provides the ability to design, build and manage a cloud platform. This study describes the implementation of a cloud computing architecture at BallotOnline. The architecture provides the interfaces and applications required for cloud-based services: client-side applications on one side, and back-end components such as servers, data storage and virtualization software on the other. It presents a GUI that end users can use to perform their respective tasks.

Plan Scope

●Cloud Advanced Features Overview

Cloud computing is important for BallotOnline because it supports continuous business growth and expansion, which require large-scale storage systems and computational power. Resource pooling is an essential feature of cloud computing: the cloud service provider shares resources among multiple clients and delivers different sets of services according to their needs (Yoo & Kim, 2018). On-demand self-service enables clients to continuously monitor their capabilities, allocate network storage and track service uptime; it is a fundamental feature of cloud computing and helps clients control computing capacity according to need. Data security is another important feature of the cloud: it ensures that data remains safe even when multiple users are working on the same file in real time. Automation provides the ability to automatically configure, install and maintain cloud services, minimizing the human effort that constant maintenance would otherwise require.
Advanced Data Protection Solutions in the Cloud

●Data Backup and Restore Using Amazon S3

Amazon S3 is a lightweight web service that provides resizable, secure storage capacity in the cloud. Backup and recovery are central to disaster management: backing up data matters in the event of data loss because it gives the organization a good chance of rolling back to the latest working environment (Alenezi, Atlam & Wills, 2019). S3 provides elastic, durable storage that can hold static files and application data, and it is commonly used for restore operations and file-level backups. Amazon S3 integrates seamlessly with on-premises environments so that they can use backed-up assets, and it is often used to store database snapshots. S3 supports backup and recovery across the different storage classes available in S3 buckets; users can choose between storage tiers to match their storage categories. Many storage systems and external backup products support the Amazon S3 API, allowing access to S3 storage through their own interfaces. Amazon also provides built-in tools for backing data up to S3: data can be copied from one S3 bucket to another using CLI tools and the AWS SDKs, which run on multiple operating systems (Lnenicka & Komarkova, 2019). These tools transfer data between S3 buckets and copy it between storage tiers for backup and recovery. Backup to S3 commonly uses command-line scripts, and this method is applicable to backing up S3 buckets, physical machines, virtual machines and EC2 instances.
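The bucket-to-bucket copy described above can be sketched with the AWS SDK for Python (boto3). This is a minimal sketch, not BallotOnline's actual backup tooling: the bucket names and the date-based backup prefix are hypothetical, the key-planning helper is pure Python, and the server-side copy call requires AWS credentials to run.

```python
import datetime

def plan_backup_keys(keys, backup_prefix="backup"):
    """Map source object keys to date-prefixed destination keys,
    e.g. 'data/a.csv' -> 'backup/2022-11-26/data/a.csv'."""
    stamp = datetime.date.today().isoformat()
    return {k: f"{backup_prefix}/{stamp}/{k}" for k in keys}

def backup_bucket(s3_client, src_bucket, dst_bucket, keys):
    """Copy each planned object server-side with the S3 CopyObject API.
    's3_client' is a boto3 S3 client, i.e. boto3.client('s3')."""
    for src_key, dst_key in plan_backup_keys(keys).items():
        s3_client.copy_object(
            Bucket=dst_bucket,
            Key=dst_key,
            CopySource={"Bucket": src_bucket, "Key": src_key},
        )
```

Because the copy is server-side, the object data never leaves AWS; only the copy request crosses the network, which is what makes this pattern practical for large backups.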
Implementing backup on Amazon S3 requires maintaining data consistency during the backup process, which may require operating-system features such as volume snapshots. In the backup and restore process, the AWS Storage Gateway can be used to transfer data to an Amazon S3 bucket. The Storage Gateway provides hybrid storage and is deployed as a virtual machine; its main advantage is a caching option that makes file access faster. There are three types of AWS Storage Gateway: file gateways, tape gateways and volume gateways (Ilieva et al. 2019). After deploying a gateway, users can access Amazon S3 storage through standard shared protocols such as iSCSI, SMB and NFS. The gateway is available as a virtual appliance for VMware vSphere. Configuring an Amazon S3 lifecycle policy is required to manage data so that it moves to the specified storage class at each phase of its life cycle. Combined with versioning, this allows storing, retrieving and recovering any version of an object in the bucket, which makes data recovery easier.

●Summary

Amazon S3 provides centralized backup: applications can store data in S3 alongside other AWS services for compute, database and storage. A single backup policy can centrally automate the creation of backups of application data, organizing backups across different AWS services. AWS Backup can create continuous backups of application data stored in S3 and restore it to a point in time with a single click. AWS Backup backs up S3 data along with its metadata, and restores all backed-up data and metadata except the original version IDs, storage class and creation dates. A lifecycle management policy defines the timeline for backups, and versioning keeps multiple versions of the same object as records.
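The lifecycle policy mentioned above is expressed as a set of rules attached to the bucket. Below is a hedged sketch of such a configuration: the rule ID, the "backup/" prefix, the transition days and the bucket name in the comment are illustrative assumptions, not values from the report.

```python
import json

# Hypothetical lifecycle configuration for a backup bucket: objects move to
# STANDARD_IA after 30 days and to GLACIER after 90, and noncurrent versions
# kept by versioning expire after 365 days.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "backup-tiering",
            "Filter": {"Prefix": "backup/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
        }
    ]
}

# With boto3 this would be applied to a (hypothetical) bucket via:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="ballotonline-backups",
#     LifecycleConfiguration=lifecycle_configuration)
print(json.dumps(lifecycle_configuration, indent=2))
```

Each rule only ever moves objects to colder (cheaper) storage classes over time, which is why the transition days must increase from tier to tier.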
Advanced Data Security Solutions in the Cloud

●AWS Key Management Services Overview

AWS Key Management Service (KMS) is useful when dealing with sensitive data because it makes it easy to create and manage cryptographic keys. It is a resilient, safe service that uses hardware security modules which are tested to protect keys, and it provides a highly available key management, storage and auditing solution for encrypting data within an application. It gives control over the encryption of stored data across AWS services. KMS features include easy ways to access and control data using managed encryption (Borangiu et al. 2019). Key management can be reduced to a few clicks, and AWS services such as Amazon S3, Amazon Redshift and Amazon EBS integrate with KMS to simplify the encryption of data in each service. KMS enables creating, enabling, disabling and rotating master keys, setting usage policies and auditing usage. KMS is used to encrypt data; its main purpose is to manage and store encryption keys. Data encryption matters for sensitive data because it ensures the data cannot be read by unauthorized users, and it should be implemented both for data at rest and for data in transit. With client-side encryption, data is encrypted on the client and sent encrypted all the way to the server and to backend services such as Redshift, S3 and EBS. With server-side encryption, the backend service encrypts the data and manages the keys. Access to encrypted data is controlled by assigning permissions on the keys, with the physical and long-term security of the keys enforcing those permissions.
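The key permissions described above are defined in a KMS key policy, a JSON document attached to the key. The sketch below is an assumption-laden illustration: the account ID (111122223333) and the role name BallotOnlineApp are placeholders, not real identities from the report.

```python
import json

# Hypothetical KMS key policy: the account root retains full control, and a
# single application role may use the key for encrypt/decrypt operations.
key_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "EnableRootAccess",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
            "Action": "kms:*",
            "Resource": "*",
        },
        {
            "Sid": "AllowAppUse",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:role/BallotOnlineApp"},
            "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey"],
            "Resource": "*",
        },
    ],
}
print(json.dumps(key_policy, indent=2))
```

Splitting administration (root statement) from usage (application statement) is the standard pattern: an application can encrypt and decrypt with the key but cannot change the policy or delete the key.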
KMS provides centralized key management, a single point for defining policies across AWS services. It is integrated with AWS services to simplify the encryption of data, and it monitors key usage, providing visibility for auditing. KMS exposes simple APIs so that encryption and key management can be built into applications. It helps maintain the integrity of data and enables users to perform digital signing using asymmetric key pairs. It uses validated hardware security modules to protect and generate keys; keys are used inside the devices and are never shared outside the AWS region (Liu et al. 2018). Security is the priority: KMS aims to make encryption easy to use for protecting data beyond basic access control, and it supports building encryption tools that work with the cloud platform. It can help BallotOnline secure its data and ensure compliance across the entire environment, putting security at the center and protecting data in a cost-effective way.

●Summary

AWS KMS gives users centralized control over the encryption keys used to protect their data. Users can rely on FIPS 140-2 validated hardware security modules to guard the safety of their keys. KMS is integrated with other AWS services to help users protect the information those services hold. Its integration with AWS CloudTrail supplies logs of key usage that help users meet compliance and regulatory requirements. From the AWS Management Console, users can simply create, import and rotate keys, and define usage policies and usage audits.
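The S3/KMS integration mentioned above surfaces as two extra parameters on an upload request. The sketch below shows the shape of such a request; the bucket, object key, body and key ARN are all hypothetical placeholders.

```python
# Hypothetical parameters for uploading an object with server-side encryption
# under a customer managed KMS key (SSE-KMS). With boto3, this dict would be
# passed to the S3 client as s3.put_object(**put_params).
put_params = {
    "Bucket": "ballotonline-data",
    "Key": "ballots/2022/results.csv",
    "Body": b"precinct,votes\n01,1523\n",
    "ServerSideEncryption": "aws:kms",
    "SSEKMSKeyId": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID",
}
```

With these parameters, S3 asks KMS for a data key under the named KMS key and encrypts the object before writing it to disk; the caller never handles the key material, which is the server-side half of the client-side/server-side split described above.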
AWS Securing Data at Rest with Encryption

●Overview

AWS KMS supports creating firewalls, rule group encryption and firewall policies with a customer managed key, which is used to give Network Firewall access to the KMS keys in the customer account. Key policies control access to customer managed keys: each key has one key policy containing the statements that determine how the key may be used. AWS KMS keys are represented by key material that exists in plaintext only inside the hardware security modules; outside them, the key material is encrypted and stored in durable, persistent storage. Key material for AWS KMS keys is generated inside FIPS 140-2 validated HSMs.

●Scaling Encryption at Rest Capabilities With AWS KMS

AWS KMS integrates with AWS services to encrypt data at rest and to facilitate signing and verification with an AWS KMS key. Protecting data at rest relies on AWS service integrations that use envelope encryption, where a data key encrypts the data. Signing and verification rely on integrated AWS services that use key pairs with an asymmetric KMS key in AWS KMS. An AWS managed KMS key can be created for inventory management, and key usage can be recorded with AWS CloudTrail. AWS KMS is a fully managed service: as the use of encryption grows, the service scales automatically to meet demand. It enables managing the KMS keys in an account and defines default limits on request rates and the number of keys (Kathuria et al. 2018). KMS keys can also be created on the customer's behalf by other AWS services. KMS ensures that keys and data are highly available, storing multiple copies of the encrypted version of each key in a system designed for durability. Importing keys into the service helps the customer maintain a secure copy of the KMS keys.
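The envelope encryption pattern mentioned above can be made concrete with a toy sketch. This is NOT real cryptography: the SHA-256 XOR keystream below is an insecure stand-in used only so the structure is runnable, and both function names are hypothetical. In AWS, the master-key step is performed inside KMS (GenerateDataKey returns a plaintext data key plus an encrypted copy), and real ciphers such as AES-GCM do the actual encryption.

```python
import hashlib
import secrets

def _toy_cipher(key: bytes, data: bytes) -> bytes:
    """XOR 'data' with a SHA-256-derived keystream. INSECURE toy cipher,
    used only because XOR is its own inverse, keeping the sketch symmetric."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def envelope_encrypt(master_key: bytes, plaintext: bytes) -> dict:
    """Envelope pattern: a fresh data key encrypts the (possibly large)
    payload; the master key, held by KMS in AWS, encrypts only the small
    data key. Both ciphertexts are stored together."""
    data_key = secrets.token_bytes(32)
    return {
        "ciphertext": _toy_cipher(data_key, plaintext),
        "encrypted_data_key": _toy_cipher(master_key, data_key),
    }

def envelope_decrypt(master_key: bytes, envelope: dict) -> bytes:
    """Recover the data key with the master key, then decrypt the payload."""
    data_key = _toy_cipher(master_key, envelope["encrypted_data_key"])
    return _toy_cipher(data_key, envelope["ciphertext"])
```

The point of the pattern is visible in the shapes: the master key only ever touches 32 bytes of data key, so the expensive, access-controlled KMS operation stays small no matter how large the payload is.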
The custom key store feature creates KMS keys in an AWS CloudHSM cluster, provides encrypted copies of keys that are backed up automatically, and gives complete control over the recovery process. AWS KMS is designed as a highly available service with a regional API endpoint, and it is designed so that plaintext keys can never be retrieved from the service. The service uses hardware security modules validated under FIPS 140-2 to protect the integrity and confidentiality of keys (Kamal et al. 2020). A plaintext key is never written to disk; it is used only in the volatile memory of the HSM for the time needed to perform the requested cryptographic operation. AWS KMS provides the ability to create and use KMS keys and data key pairs, and a KMS key can be designated for signing so that local applications can request signatures from it. Its firmware enforces multi-party access control and is reviewed and audited by an independent group.

●Summary

Key material exists in plaintext only inside the HSMs, where it is generated and used for cryptographic operations. An optional feature allows importing key material for a KMS key; imported key material must be encrypted using an RSA key pair generated in AWS KMS, and it is decrypted inside AWS KMS. AWS KMS is in charge of protecting and storing AWS KMS keys, which are logical entities backed by cryptographic key material. A distributed fleet of FIPS 140-2 validated HSMs holds the key material for KMS keys.

High Availability of Data in the Cloud

●Overview

An important aspect of migration to AWS is traffic cutover and switching. Best practices for it can be added to application migration plans to achieve the goal of zero-downtime migration.
AWS offers a complete range of cloud services that includes migration to the cloud, assisting with workload migration and helping the organization choose the best option. In the case of BallotOnline, the current architecture is reaching its limits in terms of capacity and scalability, which motivates the adoption of an expansion strategy.

●Near-Zero Downtime Migrations to AWS

AWS provides options for migrating application and infrastructure workloads. The workloads that exist in the datacenter will be migrated, which is an opportunity to explore different migration patterns. Workloads can be lifted and shifted using CloudEndure. It is necessary to review and understand how to migrate the workload, which consists of an application server and a database.
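The traffic cutover step introduced above is often implemented with weighted DNS records, shifting a small share of traffic to the migrated workload before completing the switch. Below is a hedged sketch of a Route 53 change batch for such a cutover; the domain name, IP addresses and record TTL are hypothetical, and applying it would use boto3's route53.change_resource_record_sets(HostedZoneId=..., ChangeBatch=...).

```python
def weighted_cutover(domain, onprem_ip, aws_ip, aws_weight):
    """Build a Route 53 change batch with two weighted A records whose
    weights (out of 100) control the on-prem/AWS traffic split."""
    def record(set_id, ip, weight):
        return {
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": domain,
                "Type": "A",
                "SetIdentifier": set_id,   # distinguishes the weighted pair
                "Weight": weight,
                "TTL": 60,                 # short TTL so shifts take effect fast
                "ResourceRecords": [{"Value": ip}],
            },
        }
    return {"Changes": [
        record("onprem", onprem_ip, 100 - aws_weight),
        record("aws", aws_ip, aws_weight),
    ]}

# Start by sending 10% of traffic to AWS; repeat with larger weights
# (50, then 100) as the migrated workload proves healthy.
change_batch = weighted_cutover(
    "app.ballotonline.example.", "203.0.113.10", "198.51.100.20", 10)
```

Raising the weight gradually, rather than flipping DNS all at once, is what makes the cutover near-zero downtime: if the AWS side misbehaves, the weight can be dropped back to zero without touching the replicated data.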
Nov 26, 2022