
S3 Filter


S3


Uses the boto library to upload content to S3 and returns the URL.

You can set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables in your system environment (the environment that runs the dexy command), or you can set defaults in your ~/.dexyapis file (these will override the environment):

"AWS" : {
    "AWS_ACCESS_KEY_ID" : "AKIA...",
    "AWS_SECRET_ACCESS_KEY" : "hY6cw...",
    "AWS_BUCKET_NAME" : "my-unique-bucket-name"
}

You can also have a .dexyapis file in the directory in which you run Dexy; this overrides the user-wide .dexyapis file, which lets you specify a per-project bucket.

You can add a date to your bucket name by including strftime codes in it. This is useful because you then don't have to worry about all your filenames being unique.

If you do not set AWS_BUCKET_NAME, it defaults to a name based on your username. This may not be unique across all S3 buckets, so you may need to specify a name yourself. You can use an existing S3 bucket; a new bucket will be created if yours does not already exist.
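For example, a per-project .dexyapis file might override only the bucket name. This is a sketch, assuming the file is a JSON document whose top-level keys are API names (as the snippet above suggests); the bucket name here is a placeholder, and its strftime codes are expanded as described above:

```json
{
    "AWS" : {
        "AWS_BUCKET_NAME" : "my-project-docs-%Y-%m"
    }
}
```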

Aliases for this filter

  • s3
  • botoup

Converts from file formats:

  • .*

To file formats:

  • .txt

Available settings:

  • add-new-files: Boolean or list of extensions/patterns to match. Default: False
  • added-in-version: Dexy version when this filter was first available.
  • additional-doc-filters: Filters to apply to additional documents created as side effects. Default: {}
  • additional-doc-settings: Settings to apply to additional documents created as side effects. Default: {}
  • api-key-name: The name of this API. Default: AWS
  • api-password: The password to sign into the API with. Default: None
  • api-url: The URL of the API endpoint. Default: None
  • api-username: The username to sign into the API with. Default: None
  • data-type: Alias of the custom data class to use to store filter output. Default: generic
  • document-api-config-file: Filename to store config for a file (can only have 1 per directory; dexy looks for the suffix format first). Default: None
  • document-api-config-postfix: Suffix to attach to a content filename to indicate this is the config for that file. Default: -config.json
  • examples: Templates which should be used as examples for this filter. Default: []
  • exclude-add-new-files: List of patterns to skip even if they match add-new-files. Default: []
  • exclude-new-files-from-dir: List of directories to skip when adding new files. Default: []
  • ext: File extension to output. Default: None
  • extension-map: Dictionary mapping input extensions to default output extensions. Default: None
  • help: Helpstring for the plugin. Default: the filter description shown above.
  • input-extensions: List of extensions which this filter can accept as input. Default: [u'.*']
  • keep-originals: Whether, if additional-doc-filters are specified, the original unmodified docs should also be added. Default: False
  • master-api-key-file: Master API key file for the user. Default: ~/.dexyapis
  • mkdir: A directory which should be created in the working dir. Default: None
  • mkdirs: A list of directories which should be created in the working dir. Default: []
  • nodoc: Whether the filter should be excluded from documentation. Default: False
  • output: Whether to output results of this filter by default via reporters such as 'output' or 'website'. Default: False
  • output-extensions: List of extensions which this filter can produce as output. Default: [u'.txt']
  • override-workspace-exclude-filters: If True, the document will be populated to other workspaces, ignoring workspace-exclude-filters. Default: False
  • preserve-prior-data-class: Whether the output data class should be set to match the input data class. Default: False
  • project-api-key-file: API key file for the project. Default: .dexyapis
  • require-output: Should dexy raise an exception if no output is produced by this filter? Default: True
  • tags: Tags which describe the filter. Default: []
  • variables: A dictionary of variable names and values to make available to this filter. Default: {}
  • vars: A dictionary of variable names and values to make available to this filter. Default: {}
  • workspace-exclude-filters: Filters whose output should be excluded from the workspace. Default: [u'pyg']
  • workspace-includes: If set to a list of filenames or extensions, only these will be populated to the working dir. Default: None
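
The strftime expansion of bucket names described above can be sketched as follows; the setting value here is a placeholder, not a real default:

```python
from datetime import datetime

# Sketch of the strftime bucket-name expansion described above.
# "dexy-uploads-%Y-%m-%d" is a hypothetical bucket-name setting;
# any strftime codes in it are filled in at upload time.
bucket_name_setting = "dexy-uploads-%Y-%m-%d"
bucket_name = datetime.now().strftime(bucket_name_setting)
print(bucket_name)  # e.g. dexy-uploads-2013-05-07
```

Because the date is baked into the bucket name, uploads on different days land in different buckets, so filenames only need to be unique within a day.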
Filter Source Code
import getpass
import os
import urllib
from datetime import datetime

import dexy.exceptions
from dexy.filters.api import ApiFilter

try:
    # boto is optional; the filter is only active when it imports
    import boto
    from boto.s3.key import Key
    AVAILABLE = True
except ImportError:
    AVAILABLE = False

class BotoUploadFilter(ApiFilter):
    """
    Uses boto library to upload content to S3, returns the URL.

    You can set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables in your
    system environment (the environment that runs the dexy command) or you can
    set defaults in your ~/.dexyapis file (these will override the
    environment):

    "AWS" : {
        "AWS_ACCESS_KEY_ID" : "AKIA...",
        "AWS_SECRET_ACCESS_KEY" : "hY6cw...",
        "AWS_BUCKET_NAME" : "my-unique-bucket-name"
    }

    You can also have a .dexyapis file in the directory in which you run Dexy,
    and this will override the user-wide .dexyapis file. You can use this to
    specify a per-project bucket.

    You can add a date to your bucket by specifying strftime codes in your
    bucket name, this is useful so you don't have to worry about all your
    filenames being unique.

    If you do not set bucket-name, it will default to a name based on your
    username. This may not be unique across all S3 buckets so it may be
    necessary for you to specify a name. You can use an existing S3 bucket,
    a new bucket will be created if your bucket does not already exist.
    """
    aliases = ['s3', 'botoup']
    _settings = {
            'api-key-name' : 'AWS',
            'output-extensions' : ['.txt'],
            }
    API_KEY_KEYS = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_BUCKET_NAME']

    @classmethod
    def is_active(klass):
        return AVAILABLE

    def bucket_name(self):
        """
        Figure out which S3 bucket name to use and create the bucket if it doesn't exist.
        """
        bucket_name = self.read_param('AWS_BUCKET_NAME')
        if not bucket_name:
            try:
                username = getpass.getuser()
                bucket_name = "dexy-%s" % username
                return bucket_name
            except dexy.exceptions.UserFeedback:
                print "Can't automatically determine username. Please specify AWS_BUCKET_NAME for upload to S3."
                raise
        bucket_name = datetime.now().strftime(bucket_name)
        self.log_debug("S3 bucket name is %s" % bucket_name)
        return bucket_name

    def boto_connection(self):
        if os.getenv('AWS_ACCESS_KEY_ID') and os.getenv('AWS_SECRET_ACCESS_KEY'):
            # use values defined in env
            return boto.connect_s3()
        else:
            # use values specified in .dexyapis
            aws_access_key_id = self.read_param('AWS_ACCESS_KEY_ID')
            aws_secret_access_key = self.read_param('AWS_SECRET_ACCESS_KEY')
            return boto.connect_s3(aws_access_key_id, aws_secret_access_key)

    def get_bucket(self):
        conn = self.boto_connection()
        return conn.create_bucket(self.bucket_name())

    def process(self):
        b = self.get_bucket()
        k = Key(b)
        k.key = self.input_data.web_safe_document_key()
        self.log_debug("Uploading contents of %s" % self.input_data.storage.data_file())
        k.set_contents_from_filename(self.input_data.storage.data_file())
        k.set_acl('public-read')
        url = "https://s3.amazonaws.com/%s/%s" % (self.bucket_name(), urllib.quote(k.key,))
        self.output_data.set_data(url)
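
The public URL returned by process() is built from the bucket name and the URL-quoted document key. A minimal sketch of that construction, using Python 3's urllib.parse.quote in place of the Python 2 urllib.quote above (the bucket and key values are placeholders):

```python
from urllib.parse import quote

bucket = "my-unique-bucket-name"   # placeholder bucket name
key = "docs/my report.html"        # example document key
# Mirrors the URL construction in process(): the key is URL-quoted
# so spaces and other unsafe characters survive in the link, while
# path slashes are left intact.
url = "https://s3.amazonaws.com/%s/%s" % (bucket, quote(key))
print(url)  # https://s3.amazonaws.com/my-unique-bucket-name/docs/my%20report.html
```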

Content © 2013 Dr. Ana Nelson | Site Design © Copyright 2011 Andre Gagnon | All Rights Reserved.