Here lately, I have had to produce several reports from psql. If you’re anything
like me, then you’re all about automating automatable things. And you’re also
probably all about simplicity. So I turned to my old friends, psql and cron.
First of all, you will probably want to wrap a thing like this in a shell
script. I use bash, so my `~/psql_report_script.sh` would look like this:
```bash
#!/bin/bash
PGPASSWORD=$YOUR_ENVIRONMENT_VARIABLE psql -d <database> -h <host> -U <username> \
  -c 'copy (select something from your_tables) TO STDOUT WITH CSV HEADER' \
  -o /path/to/your/file.csv
```
Now `chmod +x` that file, and let’s break that command down just a little.
- `PGPASSWORD=$YOUR_ENVIRONMENT_VARIABLE`: if you don’t want a command like this saved in your shell `history`, then put a space before it.
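The leading-space trick depends on bash’s `HISTCONTROL` setting; a quick check (the `secret` value below is a stand-in, not a real password):

```shell
# Commands that begin with a space are kept out of bash history when
# HISTCONTROL includes "ignorespace" (the default on many distros):
export HISTCONTROL=ignorespace
echo "$HISTCONTROL"
# Now a command typed with a leading space, e.g.
#  PGPASSWORD=secret psql -d mydb -c 'select 1'
# will not show up when you run `history`.
```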
- The `COPY` command ([more on that here](http://www.postgresql.org/docs/current/static/sql-copy.html)): this is pretty much the guts of the command. If your SQL is complicated, consider using the `-f` argument to pass in a file containing your SQL command. In our `COPY` command, we’re telling psql to copy the result of the query to STDOUT, in CSV format with headers.
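If the query outgrows a one-liner, the `-f` route might look like this (the file path and the query are placeholders, not from the original script):

```shell
# Keep the COPY query in its own file instead of cramming it into -c:
cat > /tmp/report.sql <<'SQL'
COPY (SELECT something FROM your_tables) TO STDOUT WITH CSV HEADER;
SQL

# Then hand the file to psql with -f; everything else stays the same:
# psql -d <database> -h <host> -U <username> -f /tmp/report.sql -o /path/to/your/file.csv
```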
I like cron. It’s the easiest, fastest way to make things happen on a schedule
on a Linux machine. Here’s how to set it up. TL;DR: type
`crontab -e` to edit your crontab file, and read the top
line. From the how-to-set-it-up link:
> minute (0-59), hour (0-23, 0 = midnight), day (1-31), month (1-12), weekday (0-6, 0 = Sunday)
The last part of each line is your command. So if I wanted this to run every day at 2am, my crontab lines would look like this:
```
01 02 * * * /home/my_user/reports/psql_report_script.sh
30 02 * * * scp /path/to/your/file.csv some_user@mysftserver:/path/to/outfile.csv
```
Here, I’m also exporting the file out to another server at 2:30am every morning.
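One caveat worth noting: cron jobs run without a terminal, so the scp step needs key-based (passwordless) authentication. A minimal sketch, assuming an ed25519 key; the key path is illustrative, and the hostname is the placeholder from the crontab line above:

```shell
# Remove any stale key first so ssh-keygen doesn't prompt to overwrite:
rm -f /tmp/report_key /tmp/report_key.pub

# Generate a key with no passphrase, once, by hand (not in cron):
ssh-keygen -t ed25519 -N "" -f /tmp/report_key -q

# Install the public key on the target server (run interactively once):
# ssh-copy-id -i /tmp/report_key.pub some_user@mysftserver

# Then tell scp which key to use in the crontab line:
# scp -i /tmp/report_key /path/to/your/file.csv some_user@mysftserver:/path/to/outfile.csv
```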
As you can see, it’s not hard at all to set up a repeatable report extract using simple command-line tools.