
Case study: Secure file storage with AWS

4 October 2022

One of our European Union clients wanted a system that allowed their members to upload files and to have them stored in a secure GDPR-compliant location.

The solution we opted for was AWS S3 with full encryption, versioning, and no public access other than single-use, time-limited URLs for downloading individual files.
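
For reference, bucket versioning, default encryption and the public access block can all be switched on from the AWS CLI. The commands below are a minimal sketch using the same {$bucket} placeholder as the examples further down, with standard SSE-S3 (AES256) encryption rather than this project's exact configuration:

# keep previous versions of each object
aws s3api put-bucket-versioning --bucket {$bucket} \
    --versioning-configuration Status=Enabled

# encrypt everything at rest (SSE-S3 shown as an example)
aws s3api put-bucket-encryption --bucket {$bucket} \
    --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'

# block all forms of public access to the bucket
aws s3api put-public-access-block --bucket {$bucket} \
    --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true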

Files are uploaded using the handy AWS CLI, storing each file under a unique id while including the original file name and type as headers:

aws s3api put-object \
    --body {$body} \
    --bucket {$bucket} \
    --content-disposition 'attachment; filename=\"{$filename}\"' \
    --content-type {$filetype} \
    --key {$key}

When a file is requested for downloading, a pre-signed URL is generated and immediately used to trigger the download:

aws s3 presign s3://{$bucket}/{$key} --expires-in {$expires}
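
As an illustration of how those two actions can be chained (the $URL variable and the curl call here are just for the example, not the project's actual download code):

# generate the time-limited URL and fetch it straight away
URL=$(aws s3 presign "s3://{$bucket}/{$key}" --expires-in {$expires})
curl --fail --output "{$filename}" "$URL"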

Some of the trickier parts of this project were: crafting policies for IAM and S3 to limit and secure access to the bucket; having the files download with the original file name and type despite being stored as unique identifiers; and setting up triggers to rename and delete uploaded files in response to website actions.
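
As a rough sketch of the S3 side of those policies (the user name and policy name below are hypothetical, not the ones used in the project), an inline IAM policy limiting the API user to object operations on the one bucket can be attached like this:

# allow the API user to upload, download and delete objects in this bucket only
aws iam put-user-policy --user-name file-storage-api --policy-name s3-single-bucket \
    --policy-document '{
      "Version": "2012-10-17",
      "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
        "Resource": "arn:aws:s3:::{$bucket}/*"
      }]
    }'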

If you are interested in this kind of solution for your website or members, the steps are:

  • set up an AWS S3 account and bucket;
  • set up an IAM user for API access (both steps are sketched after this list);
  • put in place the necessary AWS policies; and
  • configure privacy, security and data retention settings.
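
The first two steps translate roughly into the following commands; the eu-central-1 region and user name are assumptions for the sake of the example, and the access key returned by the last command is what your website then uses for API calls:

# create the bucket in an EU region (eu-central-1 is just an example)
aws s3api create-bucket --bucket {$bucket} --region eu-central-1 \
    --create-bucket-configuration LocationConstraint=eu-central-1

# create the API user and generate its access key and secret
aws iam create-user --user-name file-storage-api
aws iam create-access-key --user-name file-storage-api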

A similar system could also be a good fit for storing backups of your website files or keeping long-term historical logs.

Chirp is already using AWS for keeping incremental encrypted backups.
