Amazon S3 Bucket Meta Information with Python

In today’s example we are going to see how to read Amazon S3 bucket meta-information with Python, that is, the information that Amazon stores for each of the objects in a bucket. For this we will use the tinys3 library.

The first thing is to set the Amazon S3 access keys:

S3_ACCESS_KEY = 'BAKIBAKI678H67HGA'
S3_SECRET_KEY = '+vpOpILD+E9872AialendX0Ui123CKCKCKw'
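A safer alternative (a sketch of my own, not part of the original example) is to read the keys from environment variables instead of hardcoding them in the script; the variable names below are just a common convention and should be adjusted to your own setup:

```python
import os

# Hypothetical environment variable names; adjust to your own setup.
# An empty string is used as a fallback when the variable is not set.
S3_ACCESS_KEY = os.environ.get('S3_ACCESS_KEY', '')
S3_SECRET_KEY = os.environ.get('S3_SECRET_KEY', '')
```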

The next step is to connect to our Amazon S3 account using the tinys3.Connection() method:

import tinys3

conn = tinys3.Connection(S3_ACCESS_KEY, S3_SECRET_KEY, 'the-test', endpoint='s3-eu-west-1.amazonaws.com')

Once connected, we list the files in the bucket, as we saw in list the content of an Amazon S3 bucket with Python:

list = conn.list('')

for file in list:
    print(file)

Each file variable holds the Amazon S3 bucket meta-information in a Python dictionary, so we only have to look up the dictionary keys to access the stored information.

Within this meta-information we have:

  • key, the key (name) of the file.
  • storage_class, the storage class of the file.
  • last_modified, the date of the last modification of the file.
  • etag, the ETag associated with the file.
  • size, the size of the file in bytes.
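As an illustration, an entry with those keys might look like the dictionary below (all values are made up for this example; tinys3 yields similar dictionaries from the bucket listing):

```python
from datetime import datetime

# A made-up example of the kind of dictionary yielded per file.
file = {
    'key': 'photos/cat.jpg',
    'storage_class': 'STANDARD',
    'last_modified': datetime(2016, 5, 10, 12, 30, 0),
    'etag': '"9a0364b9e99bb480dd25e1f0284c8555"',
    'size': 48213,
}

# Access any field through its dictionary key.
print('Key: ' + file['key'])
print('Size: ' + str(file['size']) + ' bytes')
```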

Thus the code will look like this:

for file in list:
    print(file)
    print('Key: ' + file['key'])
    print('Storage Type: ' + file['storage_class'])
    print('Modification Date: ' + str(file['last_modified']))
    print('ETag: ' + file['etag'])
    print('Size: ' + str(file['size']) + ' bytes')

With this, we have displayed the Amazon S3 bucket meta-information on the screen with Python.
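Since the size comes back as a raw byte count, a small helper (my own addition, not part of tinys3) can make it more readable:

```python
def human_size(num_bytes):
    # Convert a byte count to a human-readable string,
    # stepping up one unit for every factor of 1024.
    for unit in ['bytes', 'KB', 'MB', 'GB', 'TB']:
        if num_bytes < 1024.0:
            return '%.1f %s' % (num_bytes, unit)
        num_bytes /= 1024.0
    return '%.1f PB' % num_bytes

print('Size: ' + human_size(48213))  # → Size: 47.1 KB
```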