I am writing this tutorial because I couldn’t find a working solution on the internet with a Google search and had to figure it out on my own. Sometimes, when doing web development, we have to do that regardless of how much time and how many tries it takes. In the end, here is the solution I found to convert my PostgreSQL database into a MySQL database. I am not saying this is the best or only solution in the world, but it is the only one that worked for me. There were online converters that did not work in my case, and there are complicated desktop database management utilities you can download and use, but they are nearly impossible to work with unless you have a doctorate in computer science and about 20 years of experience under your belt. Anyway, here is how I did it:
I used phpPgAdmin to export the tables one at a time in CSV format. To do so, simply open phpPgAdmin, go to the table you want, click on “Export”, select “data only”, and for “Format”, select “CSV” from the drop-down. Then tick the download option near the bottom and click the “Export” button to download the CSV file to your local machine.
Now open phpMyAdmin in your browser and select the database to add the table to, or create a new database and select it.
With the database selected, click on the “Import” tab.
Click “Browse” and select the dump file created in step one. If the dump file is larger than 2,000 KiB see the section below for special instructions for larger files.
For “Format of imported file” select “CSV”.
For “Fields terminated by”, enter a comma.
Place a check in the box that says “column names in first row”.
Leave the rest of the options set to default and click the “Go” button.
You will see it created a new table and named it something like Table1. Change the table name according to your needs and you are done. You may need to view the structure and make some changes, but it will work as-is for most situations.
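If you prefer the command line to the phpMyAdmin import form, the same CSV file can be loaded from the mysql client with a LOAD DATA statement. This is only a sketch: my_table and my_table.csv are placeholder names, and unlike phpMyAdmin’s CSV import, the target table must already exist before you run it:

```sql
-- run in the mysql client with the target database selected;
-- my_table and my_table.csv are placeholders
LOAD DATA LOCAL INFILE 'my_table.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row with the column names
```

Note that LOCAL requires the server and client to allow local infile loading, so this may need to be enabled in your MySQL configuration.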
Importing files larger than 2,000 KiB
To import larger files, you have to break up the dump CSV file into sections. Here is how:
Start from the bottom of the file and copy the last 500–3,000 lines to the clipboard, depending on how much data is in them. Try 2,000 lines; if the result is still over 2,000 KiB, do the math, scale the line count down, and try again.
Paste the data into a new CSV file.
From the original dump file, copy the first line (the header row containing the column names) and paste it in as the first line of the new CSV file you have created.
Save the file as somename.csv.
Follow the above instructions to import the file into a table and repeat as many times as needed.
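The copy-and-paste splitting above can also be scripted with standard Unix tools. Here is a sketch: dump.csv is a placeholder name for your exported file, the chunk size of 2 lines is only for the demo (use something like 2,000 for a real dump), and the first line is assumed to be the header row:

```shell
# demo: create a tiny sample CSV standing in for the real dump
# (replace this line with your actual exported file)
printf 'id,name\n1,a\n2,b\n3,c\n4,d\n5,e\n' > dump.csv

head -n 1 dump.csv > header.csv            # save the header row
tail -n +2 dump.csv | split -l 2 - body_   # split the data rows into chunks
for f in body_*; do
  cat header.csv "$f" > "part_$f.csv"      # prepend the header to each chunk
  rm "$f"
done
rm header.csv
ls part_body_*.csv                         # each part can now be imported as above
```

Each resulting part_body_*.csv file is a valid CSV with its own header row, so it can be imported through phpMyAdmin exactly as described in the steps above.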
This is a post on how to rename a table in PostgreSQL from the command line. PostgreSQL is a database application rich in command-line tools. It does have GUIs available, but most webmasters find it easier to do most database management tasks from the command line or from the psql command prompt.
Command line command to rename a table:
First, from a command prompt, type the following (replace db_name with your database name):
psql db_name
You will be given a psql command prompt after using the above command line command. Now enter the following command from the psql command prompt, replacing old_name and new_name with the current and desired table names:
ALTER TABLE old_name RENAME TO new_name;
How to back up a single PostgreSQL table with pg_dump
Here are the commands I found to work best in PostgreSQL version 8.4:
Commands to copy tables to /home/raid temp:
# change dir to /tmp
cd /tmp
# change user to postgres
su postgres
# run the following pg_dump command, replacing did_serial with the actual table name (maximize_master is the database):
pg_dump -a -t did_serial -f did_serial.sql maximize_master
In order to restore a PostgreSQL dump file made with pg_dump using the -Fc (custom format) parameter, as in the following example:
pg_dump -Fc mydb > db.dump
To restore a dump made with the above command, use the following pg_restore command at the command line:
pg_restore -C -d postgres db.dump
Here the -C parameter tells pg_restore to create the database before restoring into it, and the -d parameter names a database to connect to. postgres can be any existing database in the cluster: pg_restore connects to it, issues the CREATE DATABASE statement with the actual database name taken from the dump file, then reconnects to the newly created database to do the restore. Don’t worry, it won’t rename or touch the postgres database itself; it just needs an existing database to connect to in order to create the new one.
I also wanted to try the -O option because ownership of the tables was all messed up so I tried:
pg_restore -O -C -d postgres db.dump
The reason for the -O option was that the first time I transferred the database, it had users and roles that didn’t exist on the new server; the -O option restores the objects without ownership, which seems to have solved the issue.
Tips and tricks:
Use the df -h command in a second terminal window to check whether the database is still loading. On larger databases, like the one I did recently from a 15 GB backup file, the pg_restore command will seem to hang indefinitely. Run df -h in a second command line interface to monitor progress: run it once, run it again a little later, and compare to see whether more disk space is being used. That is the technique I used to verify that the pg_restore command was working.
Errors – When I did this I got an error saying:
pg_restore.bin: [archiver (db)] Error while PROCESSING TOC:
pg_restore.bin: [archiver (db)] Error from TOC entry 2672; 2612 4376101 PROCEDURAL LANGUAGE plpgsql postgres
pg_restore.bin: [archiver (db)] could not execute query: ERROR: language “plpgsql” already exists
Command was: CREATE PROCEDURAL LANGUAGE plpgsql;
but that is just because the database already had the plpgsql language in it, which is not a big deal and shouldn’t mess anything up, so just ignore the error and patiently wait for pg_restore to finish.
Leave the command prompt alone – It seems I messed up one restore session by hitting keys while the focus was on the terminal window where I ran the pg_restore command. To avoid the likelihood of this happening, leave that terminal window alone until it returns a new command prompt indicating that it has finished.
Missing Data – If something went wrong and some tables are missing data, you can do what I did and make a backup of each table that is messed up. Use the following pg_dump command on the original database server to dump just the table you need:
pg_dump -a -t table_name -f table_name.sql db_name
Replace table_name and db_name with the table and database names respectively. Then download the file to your desktop, open it, take out everything before the COPY command, and save it. Upload it to the new server and issue the following command:
psql db_name < table_name.sql
NOTE: you have to be in the directory containing table_name.sql or give the path to it.
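Taking out everything before the COPY command can also be done without opening the file in an editor. Here is a sketch using awk; table_name.sql is a placeholder name, and it assumes the data section begins with a line starting with "COPY ", which is how pg_dump writes data-only dumps:

```shell
# demo: a miniature stand-in for a real pg_dump data-only file
# (replace this line with your actual table_name.sql)
printf 'SET client_encoding;\nALTER TABLE x OWNER TO olduser;\nCOPY x (id) FROM stdin;\n1\n2\n\\.\n' > table_name.sql

# keep everything from the first line starting with "COPY " onward,
# dropping the SET/ALTER preamble that references old users and settings
awk '/^COPY /{found=1} found' table_name.sql > table_data.sql
cat table_data.sql
```

The resulting table_data.sql starts at the COPY statement and can be fed to psql as shown above.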