All Questions
15 questions
-2 votes · 1 answer · 272 views
Storing Blog posts in Amazon s3 storage vs Mysql
I am planning to create a SaaS like blogger.com. In the future there may be millions and millions of posts. What is the best way to store these?
Example of my Mysql columns
Title, Date, Author, Content
...
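A common pattern for this kind of question is a hybrid layout: searchable metadata stays in MySQL while the (potentially large) post body lives as an S3 object. A minimal sketch, with entirely hypothetical bucket, database, table, and column names:

```shell
# Hybrid storage sketch (all names hypothetical):
# keep queryable metadata in MySQL, store the post body in S3.

# 1) Upload the post body as an S3 object:
aws s3 cp post-body.html s3://blog-posts-bucket/posts/123.html

# 2) Record only the metadata plus the S3 key in MySQL:
mysql -u blog -p blogdb -e \
  "INSERT INTO posts (id, title, author, published, s3_key)
   VALUES (123, 'Hello', 'alice', NOW(), 'posts/123.html');"
```

This keeps MySQL rows small and uniform while S3 absorbs the unbounded content growth.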
3 votes · 1 answer · 704 views
Delete AWS RDS S3 Exported Snapshots
So I have created S3 Exports from existing snapshots in RDS. I would like to delete the duplicate ones, but I don't see any button to delete them.
I have already deleted the files from the bucket. But ...
1 vote · 3 answers · 1k views
mysqldump using a lot of space
I have a server and the / partition is 20GB in size.
Databases are stored in /mnt/mysql-data, a partition 500GB in size.
Now here's the problem: whenever I run mysqldump it fills up the / partition to ...
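When mysqldump fills /, the dump file is typically being written to the root filesystem (the current directory or a redirect target there). Writing the output, compressed, straight onto the large data partition avoids this; a sketch with a hypothetical database name and backup path:

```shell
# Stream the dump, compressed, directly onto the 500GB partition
# instead of the 20GB root filesystem (names are hypothetical):
mysqldump --single-transaction mydb | gzip \
  > /mnt/mysql-data/backups/mydb.sql.gz
```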
0 votes · 2 answers · 1k views
Large mysql dump pipe to s3
Is there any problem in doing this with lots of data?
mysqldump ... | gzip | s3cmd put - s3://bucket/file.sql.gz
The MySQL dump is about 100GB in size. What happens if gzip or s3cmd can't process ...
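The failure mode the question worries about is real: by default a shell pipeline's exit status is that of its last command only, so a failing producer can go unnoticed and leave a silently truncated object in S3. A small demonstration, using `false | cat` as a stand-in for the mysqldump pipeline:

```shell
# By default, only the LAST command's status counts:
sh -c 'false | cat'; echo "default pipeline status: $?"

# bash's pipefail makes any failing stage fail the whole pipeline,
# so a backup wrapper script can detect the broken stream:
bash -c 'set -o pipefail; false | cat'; echo "with pipefail: $?"
```

A wrapper running the real pipeline under `set -o pipefail` can alert on a non-zero status instead of trusting whatever partial file landed in the bucket.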
2 votes · 1 answer · 113 views
How to secure the db dump taken from daily run script
On one of my MySQL slave servers I have written a daily-run script, which
1) stops the slave, 2) takes a DB dump, 3) starts the slave again, 4) encrypts it, 5) copies it to my S3 bucket.
I am using aws-cli ...
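The five steps the question describes can be sketched as a script like the following. All paths, bucket names, and credentials are hypothetical, and in practice the passphrase should come from a root-only key file rather than the command line:

```shell
#!/bin/bash
# Sketch of the daily-run steps (hypothetical names throughout).
set -euo pipefail

mysql -e 'STOP SLAVE;'                                    # 1) stop replication
mysqldump --all-databases > /backup/dump.sql              # 2) take the dump
mysql -e 'START SLAVE;'                                   # 3) resume replication
gpg --batch --symmetric \
    --output /backup/dump.sql.gpg /backup/dump.sql        # 4) encrypt
aws s3 cp /backup/dump.sql.gpg s3://my-backup-bucket/     # 5) upload via aws-cli
rm /backup/dump.sql                                       # drop the plaintext copy
```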
10 votes · 5 answers · 12k views
how to pipe a mysql dump to s3cmd
I want to transfer a mysql dump, compressed, to s3.
I tried:
mysqldump -u root -ppassword -all-databases | gzip -9 | s3cmd put s3://bucket/sql/databases.sql.gz
but then I get:
ERROR: Not enough ...
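That "Not enough parameters" error is what s3cmd prints when it thinks the source argument is missing: to upload from stdin, s3cmd (1.5.0 and later) needs an explicit `-` as the source. The command also has a single-dash `-all-databases`, where mysqldump expects two. A hedged correction, assuming a recent s3cmd:

```shell
# "-" tells s3cmd to read the object body from stdin;
# note also the two dashes on --all-databases:
mysqldump -u root -p --all-databases | gzip -9 \
  | s3cmd put - s3://bucket/sql/databases.sql.gz
```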
0 votes · 2 answers · 2k views
Upload database backup from mysql to Amazon S3 or Glacier without creating local file
Is there a tool that makes it possible to back up a MySQL database to Amazon S3 or Amazon Glacier without having to create a local file with the database contents?
Something like that:
mysqldump -u ...
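One option that avoids any local file is the official AWS CLI, whose `aws s3 cp` accepts `-` to stream an object body from stdin. A sketch with a hypothetical bucket name:

```shell
# Nothing is written to local disk: the dump streams through
# gzip and straight into S3 (bucket name hypothetical):
mysqldump -u root -p --all-databases | gzip \
  | aws s3 cp - s3://my-backup-bucket/databases.sql.gz
```

For Glacier, a simple variant is to upload to S3 this way and let a bucket lifecycle rule transition the objects to a Glacier storage class.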
2 votes · 1 answer · 1k views
Amazon S3 Website & MySQL Backup
I have a website (a digital asset management system/gallery - http://www.resourcespace.org) that has a huge number of images. The total size of the website, including the images, is approximately 6GB.
...
6 votes · 4 answers · 17k views
Can you restore a MySQL dump file from an S3 bucket to an RDS instance?
I'm investigating running regular MySQL dumps to an S3 bucket as part of a disaster recovery strategy (partly spurred by the current zero cost of inbound data transfer!). In the event of a disaster, I ...
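Since RDS gives you a normal MySQL endpoint, the restore can be streamed without an intermediate file by reversing the backup pipeline. A sketch, with a hypothetical bucket, endpoint, and database name:

```shell
# Stream the dump out of S3 and straight into the RDS instance
# (bucket, endpoint, and DB name are hypothetical placeholders):
aws s3 cp s3://my-backup-bucket/db.sql.gz - \
  | gunzip \
  | mysql -h mydb.example.us-east-1.rds.amazonaws.com -u admin -p mydb
```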
9 votes · 9 answers · 10k views
How to store 3 million records in key value format?
We have to store basic information about 3 million products. Currently the info is a 180 MB CSV that gets updated quarterly.
There will be about 30,000 queries per day, but the queries are just a ...
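At 3 million rows and roughly 30,000 point lookups a day, even an embedded store like SQLite is comfortably sufficient: import the quarterly CSV once, then serve key lookups from an indexed table. A self-contained sketch with hypothetical file and column names:

```shell
# Sketch: load the CSV into SQLite, answer point lookups by key
# (file and column names are hypothetical).
rm -f products.db
printf 'sku,name,price\nA1,Widget,9.99\nB2,Gadget,19.99\n' > products.csv
tail -n +2 products.csv > products.body.csv   # strip the header row

sqlite3 products.db <<'SQL'
CREATE TABLE products (sku TEXT PRIMARY KEY, name TEXT, price REAL);
.mode csv
.import products.body.csv products
SQL

# A single-key lookup, the shape of query the question describes:
sqlite3 products.db "SELECT name FROM products WHERE sku = 'B2';"   # Gadget
```

The quarterly refresh is just a re-import into a fresh file swapped into place atomically.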
0 votes · 1 answer · 349 views
Should I use amazon EC2 or S3? [closed]
I am new to Amazon Web Services, and I have read the documentation about the AWS products. The closest products to meet my needs are EC2 and S3.
What I want to do is, I need to host some php files and ...
8 votes · 3 answers · 6k views
What's the best practice for taking MySQL dump, encrypting it and then pushing to s3?
This current project requires that the DB be dumped, encrypted and pushed to S3. I'm wondering what might be some "best practices" for such a task. As of now I'm using a pretty straightforward method ...
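One practice worth considering is doing all three steps in a single streaming pipeline, so no unencrypted dump ever touches disk. A sketch, with a hypothetical bucket and passphrase file (a GPG recipient key would be preferable to a shared passphrase):

```shell
# Dump -> compress -> encrypt -> upload as one stream
# (bucket and passphrase file are hypothetical):
mysqldump --single-transaction --all-databases \
  | gzip \
  | gpg --batch --pinentry-mode loopback --symmetric \
        --passphrase-file /root/.backup-pass \
  | aws s3 cp - s3://my-backup-bucket/db-$(date +%F).sql.gpg
```

Run it under `set -o pipefail` so a failure in any stage fails the whole backup rather than uploading a truncated object.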
2 votes · 4 answers · 1k views
Full MySQL Backup strategy
Here is our MySQL setup:
3 MySQL servers in a "replication ring": they're all masters, and S1 is slave to S2, which is slave to S3, which is slave to S4.
Up until now we would do snapshots of the ...
3 votes · 4 answers · 926 views
Best practice? Consumer data in MySQL on Amazon EBS (Elastic Block Store)
This is a consumer app, so I will care about storage costs - I don't want to have 5x copies of data lying about. The app shards very well, so I can use MySQL and not have scaling issues.
Amazon EBS ...
2 votes · 3 answers · 504 views
What is the best way to backup mysql in s3?
mysqldump is probably not the best idea on running backups every x hours but is the one that we are currently using. The backups are around 150 Megs each so sending it to other machines could waste ...