between programming languages that developers use to compile executable code and scripting languages that administrators use to perform infrastructure tasks. That leads into the next reason why we’re talking about SDKs: The line between development and operations is increasingly blurry. As operations and development responsibilities merge into the new world of DevOps, it’s important for those in charge of operations to understand the basics of how applications integrate with infrastructure.
AWS Certification Paths
There are three paths that an AWS Certification candidate can take toward Professional status: Architecting, Developing, and the one you’re focusing on by reading this book, Operations. It’s worth noting that while the Architecting path has its own professional certification (the AWS Certified Solutions Architect – Professional), the Developing and Operations paths share the same professional credential: the AWS Certified DevOps Engineer certification.
As the differentiation between Development and Operations becomes increasingly blurry, it’s important for both groups to understand what the other does on a daily basis. Hence, the SysOps and Developer paths merge at the Professional level.
It’s through the AWS SDKs and the APIs that underlie them that applications built on AWS can manage infrastructure as code. The concept of infrastructure as code is powerful, disruptive, and sets the cloud apart from the old IT world.
At the time this book was written, AWS SDKs were available for the following programming languages:
■ Android
■ Browser (JavaScript)
■ iOS
■ Java
■ .NET
■ Node.js
■ PHP
■ Python
■ Ruby
■ Go
■ C++
There are also two purpose-specific SDKs:
■ AWS Mobile SDK
■ AWS IoT Device SDK
The language-specific SDKs contain APIs that allow you to incorporate the connectivity and functionality of a wide range of AWS Cloud services into your code without having to write those functions yourself. Extensive documentation accompanies each SDK, providing guidance on how to integrate its functions into your code.
We focus on the AWS SDK for Python as our reference SDK for this chapter.
The AWS SDK for Python is also known as Boto. Like the other AWS SDKs and many of our tools, it is available as an open source project in GitHub for the community to view freely, download, and branch under the terms of its license. There is an active Boto community, including a chat group, which can help answer questions. Let’s get started by installing Boto and jump right into using it.
AWS and Open Source
AWS has been committed to the idea of open source software since day one. Open source code allows customers to review code freely and contribute new code that is optimized or corrected. AWS not only uses open source software, such as Xen, SQL, and the Linux operating system, but often contributes improvements to various open source communities.
Given that Boto is an SDK for Python, it requires Python to be installed prior to its own installation. The method of doing so depends on the operating system involved. You can find more information about installing Python at http://www.python.org/. Another prerequisite is pip, a Python tool for installing packages, which can be found at https://pip.pypa.io/.
After installing Python and pip, you install Boto using the following command:
pip install boto3
It’s worth noting the boto3 at the end of the install command. The current version of the Boto SDK is 3. Although Boto 2 is still in use, we highly encourage customers to use Boto 3. Throughout the rest of this chapter, when we refer to “Boto,” we are referring to Boto 3.
By default, Boto uses the credential files that you established in setting up the AWS CLI as its own credentials for authenticating to the AWS API endpoints.
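If you want to be explicit about which credentials Boto picks up, you can create a session tied to a named profile from that same credentials file. The following is a minimal sketch, not an example from this guide; the profile name and key values are placeholders:

# ~/.aws/credentials (written when you configured the AWS CLI)
# [default]
# aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
# aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

import boto3

session = boto3.Session(profile_name='default')  # placeholder profile name
ec2 = session.resource('ec2')                    # clients and resources from this session use that profile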
Boto contains a variety of APIs that operate at either a high level or a low level. The low-level APIs (Client APIs) are mapped to AWS Cloud service-specific APIs. The details of how to use the low-level APIs are found in the Boto 3 documentation at https://boto3.readthedocs.io/en/latest/guide/clients.html. Although the low-level APIs can be useful, we suspect that those involved in systems operations will not often need to dig into the specifics of their use.
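To make the distinction concrete, here is a brief sketch (not taken from the study guide) of a low-level Client API call; it assumes your default credentials and Region are already configured:

import boto3

ec2_client = boto3.client('ec2')            # low-level Client API
response = ec2_client.describe_instances()  # raw service response, returned as nested dictionaries
for reservation in response['Reservations']:
    for instance in reservation['Instances']:
        print(instance['InstanceId'], instance['State']['Name'])

Notice that you work directly with the dictionaries the service returns, which is exactly the detail the Resource APIs described next hide from you.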
The higher-level option, Resource APIs, lets you avoid making the low-level calls yourself and instead provides an object-oriented way to interact with AWS Cloud services. We’ll cover the use of Resource APIs in more detail next.
Boto also has a helpful feature called the waiter. Waiters provide a structure that allows for code to wait for changes to occur in the cloud. For example, when you create a new Amazon EC2 instance, there is a nominal amount of time until that instance is ready to use. Having your code rely on a waiter to proceed only when the resource is ready can save you time and effort.
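As an illustrative sketch only (the AMI ID and instance type are placeholders, not values from this guide), the following launches an instance and uses a waiter to block until it is running:

import boto3

ec2 = boto3.resource('ec2')
# Placeholder AMI ID; substitute an AMI that exists in your Region
instances = ec2.create_instances(ImageId='ami-xxxxxxxx',
                                 InstanceType='t2.micro',
                                 MinCount=1, MaxCount=1)
instance = instances[0]
instance.wait_until_running()   # waiter: returns only once the instance reaches the running state
print(instance.id, 'is ready to use')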
There is also support for multithreading in Boto. By importing the threading module, you can establish multiple Boto sessions. Those multiple Boto sessions operate independently from one another, allowing you to maintain a level of isolation between the transactions that you’re running.
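A minimal sketch of that pattern, assuming default credentials with permission to list Amazon S3 buckets, might look like this:

import threading
import boto3

def list_buckets(worker_name):
    # Each thread creates its own Session, so its API calls stay isolated from other threads
    session = boto3.Session()
    s3 = session.resource('s3')
    for bucket in s3.buckets.all():
        print(worker_name, bucket.name)

threads = [threading.Thread(target=list_buckets, args=('worker-%d' % i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()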
These are just a few of the features Boto offers. For an in-depth look at these features, or to learn more about other features available in this SDK, refer to the Boto General Feature Guides at https://boto3.readthedocs.io/en/latest/guide/index.html#general-feature-guides.
If you want to use Boto in your Python code, you start by importing the Boto SDK:
import boto3
And, if you’re using Interactive mode, you then press Enter.
To start using the Boto class in Python, invoke it by calling boto3.resource and passing in a service name in single quotes. For example, if you wanted to perform an action using Amazon EC2, you would execute something similar to the following:
ec2 = boto3.resource('ec2')
You now have an object called ec2, which you can use to act on Amazon EC2 instances in your AWS account. You can then instantiate an object representing a specific instance in your Amazon Virtual Private Cloud (Amazon VPC):
myinstance = ec2.Instance('i-0bxxxxxxxxxxxxxxx')
Perhaps you want to stop an Amazon EC2 instance programmatically, possibly at the end of a shift to save money when it’s not being used. You can then issue a command to stop that instance:
myinstance.stop()
You can start the instance back up again automatically prior to the beginning of the next shift:
myinstance.start()
Acting on infrastructure resources isn’t your only option with Boto. You can also retrieve information about your resources from the APIs. For example, perhaps you want to find out which AMI the instance in question is using. You can do that by reading the following attribute:
myinstance.image_id
This returns a string containing the AMI ID of the instance.
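Other instance attributes work the same way. For example (an illustrative sketch, not an example from this guide), the state attribute returns a dictionary describing the instance’s current state:

myinstance.state        # for example: {'Code': 80, 'Name': 'stopped'}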
We’ve been covering the language-specific AWS SDKs, which focus on the management