Getting Started with JSON API Client Library in Python

The following are the preliminary steps to access Google Cloud Storage using the JSON API client library in Python. In short, you must do the following:

  • Download and install the Python JSON API client library. In a terminal window execute the following command:
    [sudo] pip install --upgrade google-api-python-client
    
  • Enable the use of the JSON API for your Google Cloud Storage project.
  • Set the client authorization information.

See the Quickstart steps described below.

Background

Before an application can use the JSON API, the user must allow it to access their private Google Cloud Storage data. Therefore, the following steps must be performed:

  • The application must be authenticated.
  • The user must grant access to the application.
  • The user must be authenticated in order to grant that access.

All of this is accomplished with OAuth 2.0 and libraries written for it.

Important Concepts

  • Scope. The JSON API defines one or more scopes that declare a set of permitted operations. When an application requests access to user data, the request must include one or more scopes. The user must approve the scope of access the application is requesting.
  • Refresh and Access Tokens.  When a user grants an application access, the OAuth 2.0 authorization server provides the application with refresh and access tokens. These tokens are only valid for the scope requested. The application uses access tokens to authorize API calls. Access tokens expire, but refresh tokens do not. Your application can use a refresh token to acquire a new access token.
    Warning: Keep refresh and access tokens private. If someone obtains your tokens, they could use them to access private user data.
  • Client ID and Client Secret. These strings uniquely identify an application and are used to acquire tokens. They are created for your project on the API Access pane of the Google APIs Console. There are three types of client IDs, so be sure to get the correct type for your application:
    • Web application client IDs
    • Installed application client IDs
    • Service Account client IDs
    Keep your client secret private. If someone obtains your client secret, they could use it to consume your quota, incur charges against your Google APIs Console project, and request access to user data.
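The relationship between the two token types can be sketched in a few lines of Python. This is only an illustration of the concept (the class, the names, and the stand-in authorization server below are all hypothetical); real applications should use the OAuth 2.0 client library shown later in this post:

```python
import time

class TokenPair(object):
    """Illustrates that access tokens expire while refresh tokens do not."""
    def __init__(self, access_token, refresh_token, lifetime_seconds):
        self.access_token = access_token
        self.refresh_token = refresh_token
        self.expires_at = time.time() + lifetime_seconds

    def expired(self):
        return time.time() >= self.expires_at

    def refresh(self, token_endpoint):
        # Trade the long-lived refresh token for a new access token.
        self.access_token, lifetime = token_endpoint(self.refresh_token)
        self.expires_at = time.time() + lifetime

def fake_token_endpoint(refresh_token):
    # Stand-in for the OAuth 2.0 authorization server's token endpoint.
    return ('new-access-token-for-' + refresh_token, 3600)

# An access token with a zero lifetime is already expired...
tokens = TokenPair('initial-token', 'refresh-123', lifetime_seconds=0)
if tokens.expired():
    # ...so the refresh token is used to acquire a new one.
    tokens.refresh(fake_token_endpoint)
print(tokens.access_token)
```

The point of the sketch is the asymmetry: the access token is replaced routinely, while the refresh token persists, which is also why both must be kept private.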

Building and Calling the Service

The following steps describe how to build an API-specific service object, make calls to the service, and process the response.

  1. Build the Service. You use the build() function to create a service object. It takes an API name and API version as arguments. You can see the list of all API versions on the Supported APIs page. The service object is constructed with methods specific to the given API. The following is an example:
    from apiclient.discovery import build as discovery_build
    service = discovery_build('storage', 'v1beta2', ...)
    
  2. Create the Request. The available methods are defined by the API. Calling a method returns an HttpRequest object. The following is an example:
    request = service.buckets().insert(
                    project=project_id, 
                    body={'name': bucket_name})
    
  3. Get the Response. Creating a request does not actually call the API. To execute the request and get a response, call the execute() function as follows:
    response = request.execute()
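The three steps can be seen end to end with a small stand-in for the service object. This sketch is purely illustrative (the stub classes below are hypothetical, not part of the client library); its only point is that creating a request sends nothing until execute() is called:

```python
class StubRequest(object):
    """Mimics the library's HttpRequest: holds the call until execute()."""
    def __init__(self, project, body):
        self.project = project
        self.body = body
        self.sent = False

    def execute(self):
        # The real library issues the HTTP request here.
        self.sent = True
        return {'kind': 'storage#bucket', 'name': self.body['name']}

class StubBuckets(object):
    def insert(self, project, body):
        return StubRequest(project, body)

class StubService(object):
    def buckets(self):
        return StubBuckets()

# 1. Build the service (here a stub; normally returned by build()).
service = StubService()

# 2. Create the request: nothing has been sent yet.
request = service.buckets().insert(project='my-project-id',
                                   body={'name': 'my-bucket'})
assert not request.sent

# 3. Execute the request to get the response.
response = request.execute()
print(response['name'])
```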
    

Quickstart

The best way to get started is to read the documentation Google Cloud Storage JSON API Client Library for Python. Follow the steps in its Quickstart section to create a starter application that gets you up and running. In particular, perform these steps:

  1. Select the platform you want to use. For simplicity, select Command Line.
    Command Line Application


  2. Click the Configure Project button. A dialog window is displayed.
  3. In the dropdown list select the name of the project for which you want to enable the JSON API.
  4. Click the Continue button. An instruction list is displayed. For convenience, the instructions are repeated below:
    • Install Python 2 (if it is not already installed).
    • Download the starter application and unzip it. Note that you must download the application from the link shown in the instruction list.
    • Download the client secrets file and use it to replace the file included in the starter application. Note that you must download the client secrets file from the link shown in the instruction list.
  5. In a terminal window, from within the application directory, run the application as follows:
    python sample.py
    
    The first time you run the application, a Web page is displayed asking you to allow access to your Google Cloud Storage. Click the Accept button.
    Allow Project Access


    On the first run you will get the following output:

    Authentication successful.
    Success! Now add code here.

You are up and running! At this point you will want to add Cloud Storage API calls to the sample.py file as shown below.

Analyzing Sample.py

In this section we analyze the sample.py code to highlight the key points. In essence, sample.py shows how to set up the OAuth 2.0 credentials to access a project. Note that the code shown differs slightly from the downloaded version, to make it more readable.

The following line obtains the path to the client_secrets.json file. This file contains the credentials (OAuth 2.0 information) that sample.py needs to access your project. You can download this file from the Cloud Console at this location: <https://cloud.google.com/console#/project/[your project ID]/apiui>

CLIENT_SECRETS = os.path.join(os.path.dirname(__file__), 'client_secrets.json')

Next you set up the Flow object to be used for authentication. The example below adds two scopes, but you should add only the scope you need. For more information on using scopes, see Google+ API Best Practices.

RW_SCOPE = 'https://www.googleapis.com/auth/devstorage.read_write'
RO_SCOPE = 'https://www.googleapis.com/auth/devstorage.read_only'
FC_SCOPE = 'https://www.googleapis.com/auth/devstorage.full_control'

FLOW = client.flow_from_clientsecrets(
  CLIENT_SECRETS,
  scope=[RW_SCOPE, RO_SCOPE], 
  message=tools.message_if_missing(CLIENT_SECRETS))

The following lines are critical. If the credentials (storedcredentials.json) do not exist or are invalid, the native client flow runs. If the flow succeeds, the Storage object ensures that the new credentials are written back to the file.

storage = file.Storage('storedcredentials.json')
credentials = storage.get()
if credentials is None or credentials.invalid:
  credentials = tools.run_flow(FLOW, storage, flags)

Customization

Add the following function to list the objects contained in a bucket.

from json import dumps as json_dumps

def listObjects(bucketName, service):
    print 'List objects contained by the bucket "%s".' % bucketName
    fields_to_return = \
        'nextPageToken,items(bucket,name,metadata(my-key))'
    request = service.objects().list(bucket=bucketName,
                                     fields=fields_to_return)
    response = request.execute()
    print json_dumps(response, indent=2)

Call the function from main as follows:

listObjects('myBucket', service)

RELATED ARTICLES

Build a Google Cloud Storage Boto Application

This post demonstrates how to build a Google Cloud Storage Boto application that interacts with the storage service through the XML API (using the boto library). It is a simple application to help you understand the API syntax (and semantics) when interacting with Google Cloud Storage. It provides a simple interface which allows you to perform tasks such as listing the buckets in a project, listing the objects in a bucket, creating a bucket, creating an object, and so on.

BotoXmlApi Application


The boto library is written in Python, so the application is built in the same programming language.

To build the application, you will use Eclipse as shown next. Before you perform these steps, ensure that you have satisfied the requirements described here: Google Cloud Storage Python Applications Prerequisites.

Create the Application Project

  1. Activate Eclipse.
  2. From the File menu, select New->PyDev Project. The PyDev Project window is displayed.
  3. In the Project name box, enter BotoXmlApi.
  4. Accept the default workspace; for example: /Users/[your username]/[your work directory]/BotoXmlApi. This is the directory where the program will be stored.
  5. If you have more than one Python version installed, perform the following steps:
    • In the Grammar Version selection list select 2.7.
    • In the Interpreter selection list select the desired Python interpreter. For more information, see Configure PyDev.
      PyDev Project


  6. Click Finish. The BotoXmlApi project is created and displayed in the Package Explorer.
  7. If the PyDev perspective is not selected yet, click on the Open Perspective icon in the upper right corner of Eclipse. From the pop-up menu select PyDev and click OK.
  8. In the Package Explorer, expand the BotoXmlApi project node.
  9. Right-click the src folder.
  10. Select New->PyDev Module.
  11. In the pop-up dialog window enter the following information:
    • In the Package box enter gcs_examples.
    • In the Name box enter BotoXmlApiMain.
    PyDev Module


  12. Click Finish.
  13. Select Module Main and click OK.
    An empty BotoXmlApiMain.py is created. The module actually contains the standard Python main check, because of the template chosen. You'll enter the code needed to interact with Google Cloud Storage in the next sections. But before you do that, you must configure the project to access some needed external libraries.

Configure External Libraries

To build the BotoXmlApi application you need to configure the required libraries. These libraries are contained in the gsutil folder that you obtained when installing the gsutil tool. For more information, see Install gsutil Tool.

    1. For convenience, create a string substitution variable which points to the gsutil directory, as follows:
      • In the Package Explorer, right-click on BotoXmlApi project name.
      • In the displayed selection window, click Properties. The Properties for BotoXmlApi window is displayed.
      • In the left pane, click PyDev - PYTHONPATH.
      • In the right pane, click the String Substitution Variables tab.
      • Click the Add Variable button. A dialog window is displayed. Perform the following steps:
          • In the Name box enter gsutildir.
          • In the Value box enter the path to the gsutil folder. Click the Browse button and select the VERSION file in the gsutil directory. Then delete VERSION in the path and keep the rest.
        {gsutildir} variable


        • Click OK.
    2. Click the External Libraries tab.
    3. Click the Add based on variable button.
    4. Add a reference to the boto library by entering ${gsutildir}third_party/boto.
    5. Repeat steps 3 and 4 to add the rest of the libraries as shown in the following picture.
External Libraries


BotoXmlApi Implementation

Finally, let’s get our hands dirty and dig into the actual code.
The best way to do this is to:

  1. Download the code here: storage-xmlapi-python.
  2. Copy and paste the various modules in your project.

The application contains the following modules:

  1. BotoXmlApiMain.py. It contains the main function, which is called when the application is executed, and provides a simple user interface to exercise the Google Cloud Storage XML API.
  2. Project.py. It contains the functions to perform project wide operations.
  3. Buckets.py. It contains the functions to perform bucket operations in the specified project.
  4. Objects.py. It contains the functions to perform object operations in the specified project.

Before running the application, study the code and read the comments to get familiar with the application functionality and the way it is built.

Run the Application in Eclipse

  1. In the Package Explorer right-click BotoXmlApiMain.py.
  2. In the popup window select Run As-> Run Configurations. The Run Configurations window is displayed.
  3. In the Run Configurations window click the Arguments tab.
  4. In the Program arguments box enter the following arguments:
    • --project_id [your Google Cloud Storage project ID]
    • --bucket_name [name of your bucket]
    • --object_name [name of your object]
    • --debug_level [0|1|2] message debug level. If this argument is
      missing, the default value of zero (no messages) is used.
  5. Click the Apply button.
  6. In the Package Explorer, right-click BotoXmlApiMain.py.
  7. In the popup window select Run As->Python Run.
  8. In the Eclipse console window enter p1. This displays the
    buckets in the specified project.
  9. Clear the output by entering any key.
  10. Select any entry from the menu, such as b1, to list the objects
    in the specified bucket.
  11. Clear the output by entering any key.
  12. Make other selections from the menu to exercise the XML API.
  13. Enter x to exit the application.
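The menu-driven interface described above (p1 to list the buckets, b1 to list the objects, x to exit) can be sketched as a simple dispatch table. The command names come from this post; the handlers below are stubs, not the real Project/Buckets/Objects implementations:

```python
def handle(command, handlers):
    """Look up a menu command and run the corresponding action."""
    action = handlers.get(command)
    if action is None:
        return 'Unknown command: %s' % command
    return action()

# Stub handlers standing in for the real API calls.
handlers = {
    'p1': lambda: 'listing buckets in the project',
    'b1': lambda: 'listing objects in the bucket',
    'x': lambda: 'exit',
}

print(handle('p1', handlers))
```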

Run the Application from the Command Line

To run the application from the command line you must configure your environment properly. Remember that you are going to leverage the libraries that came with the gsutil tool.  For more information, see Install gsutil Tool.
Perform these steps:

  1. Include the gsutil, boto, and third_party libraries in your PYTHONPATH by adding this line to the .bash_profile file:
    export PYTHONPATH=${PYTHONPATH}:${TOOLS}/gsutil:${TOOLS}/gsutil/third_party/boto:${TOOLS}/gsutil/gslib

    Replace ${TOOLS} with the directory where you installed gsutil (by default it is ${HOME}). For example, add this to your .bash_profile: TOOLS="/Users/[username]/Tools". Replace username with an applicable value.

  2. If you do not have virtualenv installed, from a Terminal window, install it as follows: pip install virtualenv.
  3. Create (optional) a directory where to keep your virtual environments, for example:
    mkdir Work/Programming/Python/BotoEnv.
  4. Create the virtual environment, for example:
    virtualenv Work/Programming/Python/BotoEnv/BotoXmlApi
  5. Switch to the virtual environment directory, for example:
    cd Work/Programming/Python/BotoEnv/BotoXmlApi.
  6. Activate the virtual environment as follows: source bin/activate.
  7. Perform the following installs which allow the application to reference the needed libraries:
    pip install ${TOOLS}/gsutil/third_party/python-gflags
    pip install ${TOOLS}/gsutil/third_party/socksipy-branch
    pip install ${TOOLS}/gsutil/third_party/httplib2
    pip install ${TOOLS}/gsutil/third_party/retry-decorator
    pip install ${TOOLS}/gsutil/third_party/google-api-python-client
  8. Copy all the Python modules (BotoXmlApiMain.py, Project.py, Buckets.py, Objects.py, Playpen.py) into your virtual environment directory.
  9. At the Terminal command prompt enter the following:
    python BotoXmlApiMain.py --project_id [project ID] --bucket_name [bucket name] --object_name [object name] --debug_level [0|1|2]

    Replace the arguments with your values.
    This displays the application menu which allows you to exercise the desired API.
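The downloaded application handles these flags with the python-gflags library installed earlier. Purely as an illustration, the same argument handling can be sketched with the standard argparse module (the flag names come from this post; everything else, including the sample values, is hypothetical):

```python
import argparse

parser = argparse.ArgumentParser(description='BotoXmlApi sample')
parser.add_argument('--project_id', required=True,
                    help='Google Cloud Storage project ID')
parser.add_argument('--bucket_name', help='name of your bucket')
parser.add_argument('--object_name', help='name of your object')
parser.add_argument('--debug_level', type=int, default=0,
                    choices=[0, 1, 2],
                    help='message debug level (default: 0, no messages)')

# Parse a hypothetical command line instead of sys.argv, for illustration.
args = parser.parse_args(['--project_id', '123456',
                          '--bucket_name', 'my-bucket',
                          '--debug_level', '1'])
print(args.project_id)
```

Note how the sketch mirrors the documented default: when --debug_level is omitted, argparse supplies zero.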

Have Fun!!

Google Cloud Storage Python Applications Prerequisites

This post describes the prerequisites for Google Cloud Storage Python applications. To this end we'll use the Python programming language, the Mac OS X 10.x platform, and the Eclipse IDE. Later we'll focus on other programming languages and/or platforms.

Install Python

Remember that the Mac OS X 10.x platform already has Python installed. You should install Python version 2.7.x separately, for example from here: http://www.python.org/download/releases/2.7.6/.

Apple uses its own version of Python and proprietary extensions to implement some of the software distributed as a part of Mac OS X. Unless you know what you are doing, do not mess with it.

After the installation, make sure that the environment variable PATH is properly set in the ~/.bash_profile as follows:

PATH="/Library/Frameworks/Python.framework/Versions/2.7/bin:${PATH}"

Install PyDev in Eclipse

PyDev is a Python IDE for Eclipse. Its latest version requires Java SDK version 7 or above. If you do not have this Java version, install it from here: Java SE Development Kit 7. Then follow these steps:

  1. Activate Eclipse.
  2. In the menu bar click Help->Install New Software.
  3. In the Available Software window click the Add button.
  4. In the Add Repository window, perform these operations:
    • In the Name box enter PyDev.
    • In the Location box enter http://pydev.org/updates/.
    PyDev Repository


  5. Click OK.
  6. In the list that is displayed, check the box next to PyDev.
  7. Accept the default selection and click Next.
  8. Accept the terms and conditions, then click Finish.

This installs the latest PyDev version.

Configure PyDev

If you have more than one Python version installed, you must configure PyDev to use Python 2.7.x interpreter as follows:

  1. In Eclipse menu bar, click Eclipse->Preferences.
  2. In the left pane of the Preferences window, expand the PyDev node.
  3. Expand the Interpreter node.
  4. Click Python Interpreter.
  5. In the right pane, click the Quick Auto-Config button. This configures the default Python interpreter (i.e., Python 2.7). A selection window is displayed.
  6. Select python. In the lower section you will see the list of Python 2.7 libraries.
  7. Click OK.

Create a Google Cloud Storage Project

Select or create a Cloud Storage Project as described here: How to activate Google Cloud Storage. For your convenience, the steps are also described next.

  1. If you do not have a Google account, create one. For more information, see  Create a Google Account.
  2. Activate the Google Cloud Console. Then perform the following steps:
    • If you already have a project, select it.
    • If you do not have a project, create one. Notice the project ID. You will use it often to perform the Google Cloud Storage operations.
  3. Enable Google Cloud Storage for the project as follows:
    • In the console left pane, expand the APIs & auth node and select APIs.
    • In the console right pane, turn the button by Google Cloud Storage from Off to On.
  4. Enable billing.
  5. In the left pane click Settings.
  6. In the right pane click Billing Account for [your project name] and perform these steps:
    • Set your billing profile.
    • Select your modality of payment.
    • Submit and activate the account.
    • Ensure that your main e-mail is verified so you can receive billing information.

That’s it. Now you are ready to use the service.

Enabling billing does not necessarily mean that you will be charged. For more information, see Pricing and Terms.

Let’s verify that you can use the service with the gsutil tool. Ensure that you have installed the tool first. For more information, see Install gsutil Tool.

  1. Open a Terminal window.
  2. At the command prompt enter: gsutil mb gs://<unique bucket name>. This creates a bucket in your project using the default region. Notice that the bucket name must be unique in the entire Google Cloud Storage namespace.
  3. To verify that the bucket has been created, at the command prompt enter: gsutil ls. The bucket name you just created will be listed.

For a complete list of gsutil commands and related syntax, see gsutil Tool.

Install gsutil Tool

To install the tool follow the steps described next. For more information, see Install gsutil.

  1. Download gsutil.tar.gz.
  2. Extract the archive in the directory of your choice as follows: tar xfz gsutil.tar.gz -C ~/myDir. If you do not specify the target directory, the tool is installed in your ${HOME} directory.
  3. Add gsutil to your PATH environment variable. On MAC add the following to the ~/.bash_profile : PATH=${PATH}:~/myDir/gsutil.
  4. Restart the Terminal.
  5. At the command prompt enter gsutil. You should get the tool help.
  6. At the command prompt enter: gsutil config. A link is displayed. This step configures the tool with security information so it can access your project.
  7. Open a new browser session and go to the link obtained in the previous step.
  8. Click the Accept button. An access code is displayed.
  9. Copy the access code and enter it in the Terminal window.
  10. Enter your project ID. gsutil then creates the .boto configuration file, which contains information, such as security data, needed when performing Google Cloud Storage operations.

Build Google Cloud Storage Client Applications

You can build Google Cloud Storage client applications by selecting one of the supported RESTful APIs. Google Cloud Storage (GCS) supports two kinds of APIs, as described next.

XML API

The XML API is the first API created by the GCS development team. It uses the HTTP protocol with the payload in XML format.

XML API Context


This API is used by current and earlier applications, mainly written in Python using the boto library, or in Java using other libraries such as JetS3t.

The XML API v1.0 is interoperable with some cloud storage tools and libraries that work with services such as Amazon Simple Storage Service (Amazon S3) and Eucalyptus Systems, Inc.

The following Python code snippet shows how to list the buckets contained in a project using the boto library. In future posts, we’ll show you how to exercise other parts of the XML API using the same library.

import logging
import boto

def list_buckets(project_id, debug_level):
    '''
    Perform a GET Service operation to list the buckets 
    contained in the specified project.
    @param project_id: The id of the project that contains 
    the buckets to list.
    @param debug_level: The level of debug messages to be printed.
    '''
    try:
        # URI scheme for Google Cloud Storage.
        GOOGLE_STORAGE = "gs"

        # Define the project URI.
        uri = boto.storage_uri("", GOOGLE_STORAGE, debug_level)
        
        # Define the header values.
        header_values = {"x-goog-api-version": "2",
                         "x-goog-project-id": str(project_id)}

        # List the buckets in the project.
        for bucket in uri.get_all_buckets(headers=header_values):
            print bucket.name

    except boto.exception.GSResponseError as e:
        logging.error("list_buckets, error occurred: %s", e)

For testing purposes, you can use the XML API directly with the curl tool.

JSON API

The JSON API is the second API created by the GCS development team. It uses the HTTP protocol with the payload in JSON format. At the moment, this API is still in the experimental stage.

JSON API Context


JSON format is poised to become the standard way to communicate with any Google cloud service. Even though the details may differ from one service to another, once you know how to use a certain API, you should be able to apply this knowledge anywhere else.

Examples of using the JSON API can be run directly from the browser. For example, if you already have a project, you can list its buckets from this location: Bucket:List.

The client libraries support several programming languages, which allows for a wider range of applications compared to the XML API. For information about the supported languages, see Libraries.

Both the XML and JSON APIs use the HTTP protocol as defined by the HTTP/1.1 specification and provide a RESTful interface for accessing Google Cloud Storage to perform Create, Read, Update, Delete (CRUD) operations. While the first API uses the XML format, the second uses the JSON format for the payload encoding.
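The difference in payload encoding can be illustrated with two simplified, hypothetical responses to a bucket-listing request (the real services return much richer documents; only the shapes are representative). Both carry the same information, parsed here with standard-library modules:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical, simplified payloads describing the same bucket listing.
xml_payload = ('<ListAllMyBucketsResult>'
               '<Buckets><Bucket><Name>my-bucket</Name></Bucket></Buckets>'
               '</ListAllMyBucketsResult>')
json_payload = '{"kind": "storage#buckets", "items": [{"name": "my-bucket"}]}'

# XML API style: parse the XML document tree.
root = ET.fromstring(xml_payload)
xml_names = [b.findtext('Name') for b in root.iter('Bucket')]

# JSON API style: parse the JSON document into plain dicts and lists.
data = json.loads(json_payload)
json_names = [item['name'] for item in data.get('items', [])]

# Both encodings describe the same resources.
print(xml_names == json_names)
```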

Conclusions

No matter which format you use, you are not going to build your HTTP method calls from scratch. In theory you could get down to the metal and use the protocol directly. However, instead of creating HTTP requests and parsing responses manually, you may want to use the Google APIs client libraries.

You could use a lower-level client library such as httplib2, but it is advisable to stay with the supported Google libraries. They provide better language integration, improved security, and support for making calls that require user authorization.

 

An Introduction to Cloud Computing

Overview

The Cloud is a computing model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. For more information, see  The NIST Definition of Cloud Computing .

This is a technological breakthrough compared to the traditional approach where resources had to be allocated in advance with the danger of overestimating (or underestimating) the needs.

But, most importantly, in the cloud the allocation is done automatically and in real time. This is the elasticity attribute of the cloud. The cloud's main architectural principle is predicated on delivering IT services on demand. The result is software architectures with qualities such as elasticity, auto-scaling, fault tolerance, and administration automation.

An extension of this is the concept of application as a service, usually a REST web service. For more information about designing for the cloud, see Cloud Ready Design Guidelines.

From a hardware point of view, three aspects are new in cloud computing:

  • The “infinite” computing resources available on demand, thereby eliminating the need for users to plan far ahead for provisioning
  • The elimination of an up-front commitment by the users, thereby allowing companies to start small and increase hardware resources only when there is an increase in their needs
  • The ability to pay for use of computing resources on a short-term basis as needed (e.g., processors by the hour and storage by the day) and release them as needed, thereby rewarding conservation by letting machines and storage go when they are no longer useful.

You may want to take a look at the following video to understand the difference between cloud and traditional virtualization: Cloud and Virtualization.

Cloud Deployment and Service Models

Deployment models define different types of ownership and distribution of the resources used to deliver cloud services to different customers.

Deployment Models

Cloud environments may be deployed over a private infrastructure, public infrastructure, or a combination of both.

The most common deployment models as defined by the National Institute of Standards and Technology (NIST) include the following:

  • Private cloud. The cloud infrastructure is operated solely for a single organization (client). It may be managed by the organization itself or a third-party provider, and may be on-premise or off-premise. However, it must be solely dedicated for the use of one entity.
  • Community cloud. The cloud infrastructure is shared by several organizations and supports a specific community with shared requirements or concerns (for example, business model, security requirements, policy, or compliance considerations). It may be managed by the organizations or a third party, and may be on-premise or off-premise.
  • Public cloud. The cloud infrastructure is made available to the general public or a large industry group and is owned by a cloud provider (an organization selling cloud services). Public cloud infrastructure exists on the premises of the cloud provider.
  • Hybrid cloud. The cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by technology to enable portability. Hybrid clouds are often used for redundancy or load-balancing purposes. For example, applications within a private cloud could be configured to utilize computing resources from a public cloud as needed during peak capacity times.

Service Models

Service models identify different control options for the cloud client and cloud provider. For example, SaaS clients simply use the applications and services provided by the provider, where IaaS clients maintain control of their own environment hosted on the provider’s underlying infrastructure. The following are the most commonly used service models:

  1. Software as a Service (SaaS). It enables the end user to access applications that run in the cloud. The applications are accessible from various client devices through a thin interface such as a web browser. Some examples are:
    1. Gmail
    2. Google Docs
    3. Microsoft Office 365
  2. Platform as a Service (PaaS). It enables the deployment of applications in the cloud. These applications are created using programming languages and tools supported by the cloud provider. Some examples are:
    1. Google App Engine
    2. AWS Elastic Beanstalk
  3. Infrastructure as a Service (IaaS). It enables the provisioning of compute processing, storage, networks, and other computing resources to deploy and run applications. You cannot control the underlying physical infrastructure, though. Some examples are:
    1. Amazon S3
    2. Google Compute Engine
    3. Google Cloud Storage
    4. Google BigQuery

The following picture depicts the service models and the way they stack up:

You can find the above picture and more information at NIST Cloud Computing Reference Architecture. The next picture shows the control and responsibilities for cloud clients and providers across the service models:

Cloud Logical Architecture

The cloud architecture is structured in layers. Each layer abstracts the one below it and exposes interfaces that layers above can build upon. The layers are loosely coupled and provide horizontal scalability (they can expand) if needed. As you can see in the next picture, the layers map to the service models described earlier.

As shown in the previous picture, the cloud architecture contains several layers, as described next.

  • Hosting Platform. Provides the physical, virtual, and software components. These components include servers, operating systems, network and storage devices, power control, and virtualization software. All these resources are abstracted as virtual resources to the layer above. The virtual machine (VM) is at the core of cloud virtualization. It represents a software implementation of a computing environment in which an operating system and other apps can run. The virtual machine typically emulates a physical computing environment, but requests for CPU, memory, hard disk, network, and other hardware resources are managed by a virtualization layer which translates these requests to the underlying physical hardware.
    VMs are created within a virtualization layer, such as a hypervisor that runs on top of a client or server operating system. This operating system is known as the host OS. The virtualization layer can be used to create many individual, isolated VM environments.
  • Infrastructure Services. The important function of this layer is to abstract the hosting platform as a set of virtual resources and to manage them based on scalability and availability. The layer provides three types of abstract resources: compute, storage and network. It also exposes a set of APIs to access and manage these resources. This enables a user to gain access to the physical resources without knowing the details of the underlying hardware and software and to control them through configuration. Services provided by this layer are known as Infrastructure as a Service (IaaS).
  • Platform Services. Provides a set of services to help integrate on-premise software with services hosted in the cloud. Services provided by this layer are known as Platform as a Service (PaaS).
  • Applications. Contains applications built for cloud computing. They expose web interfaces and services and enable multitenant hosting. Services provided by this layer are known as Software as a Service (SaaS).

The vertical bars in the picture represent components that apply to all layers with different degrees of scope and depth. Mainly they support administrative functions, security handling, and cloud programmability (the latter supporting the most common programming languages).

Hello Caprese!

This post is to start my new blog, just for fun.

Insalata Caprese


Ingredients

  1. 1 pound cherry tomatoes
  2. 1/2 pound fresh mozzarella balls
  3. A few fresh basil leaves
  4. Extra virgin olive oil
  5. Sea salt and freshly ground black pepper to taste
  6. Balsamic vinegar (just to garnish)

Notes

The way the Italians say it: Caprese. The adjective “caprese” is Italian and means from the island of Capri. The fresh salad shown above must have been created there, or at least made famous by the restaurants catering to the many tourists who have visited the island for centuries.

The Romans vacationed in Capri. As a matter of fact, you can still visit some of the Roman ruins, the most famous being the villa of the emperor Tiberius, which commands a spectacular view of the Mediterranean Sea.
