Moved blog from MySQL to Postgres

Because MySQL was taxing the server's resources, the blog has been moved from MySQL to PostgreSQL.



And the process:










Re: Moved from MySQL to PostGres



Postby rootnl2k » Tue Feb 11, 2014 10:15 am

Correct you are Gavin.



So 4 steps.



1) from http://en.wikibooks.org/wiki/Converting_MySQL_to_PostgreSQL#Export_using_CSV-files





When you have a large SQL dump with binary data inside, it is hard to modify the data structure by hand, so there is another way to export your data to PostgreSQL. MySQL has an option to export each table of a database as a separate .sql file with the table structure and a .txt file with the table's data in CSV format:



mysqldump -u username -p -T/path/to/export databasename



Notice that /path/to/export must be writable by the user that mysqld runs as (in most cases the mysql user), because the server process, not the client, writes the .txt files.
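For example (the path here is illustrative), a quick way to give mysqld write access is a world-writable scratch directory; in production you would chown it to the mysql user instead:

```shell
# create a scratch export directory the mysqld process can write to
# (illustrative path; chown to the mysql user for anything long-lived)
EXPORTDIR=/tmp/export
mkdir -p "$EXPORTDIR"
chmod 1777 "$EXPORTDIR"   # world-writable with the sticky bit, like /tmp

# then run the export against it:
# mysqldump -u username -p -T"$EXPORTDIR" databasename
```

After the dump, the directory should contain one .sql file and one .txt file per table.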



2) Create a new Serendipity instance, specifying PostgreSQL as the database type



3) From http://en.wikibooks.org/wiki/Converting_MySQL_to_PostgreSQL#Export_using_CSV-files



After that you should modify your table structure according to PostgreSQL's format:



convert data types

create separate keys definitions

replace escape characters
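As a rough sketch of the first item (the sed rules and the sample table below are illustrative, not exhaustive), a few substitutions cover the most common type conversions:

```shell
# illustrative sed rules for converting MySQL DDL to PostgreSQL; they only
# cover a few common cases (backticks, auto_increment, datetime, ENGINE)
convert_ddl() {
  sed -e 's/`//g' \
      -e 's/int([0-9]*) NOT NULL auto_increment/serial/' \
      -e 's/datetime/timestamp/g' \
      -e 's/ ENGINE=[^;]*//'
}

# sample mysqldump table definition run through the conversion:
convert_ddl <<'EOF'
CREATE TABLE `entries` (
  `id` int(11) NOT NULL auto_increment,
  `created` datetime,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
EOF
```

Real dumps will also need attention to things like enum columns, unsigned integers, and zero dates, which need case-by-case treatment.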



When the table structure is ready, load it as shown above. You should also prepare the data files: replace carriage-return characters with a literal "\r" and remove characters that are invalid for your data encoding. Here is an example bash script showing how to do this and load all the data into your database:



#!/bin/bash

CHARSET="utf-8"            # your current database charset
DATADIR="/path/to/export"
DBNAME="databasename"

for file in "$DATADIR"/*.txt; do
    TMP=${file%.*}
    TABLE=${TMP##*/}
    echo "preparing $TABLE"

    # replace carriage returns with a literal \r
    sed 's/\r/\\r/g' "$file" > "/tmp/$TABLE.export.tmp"

    # drop non-printable and invalid sequences for the current charset
    iconv -t "$CHARSET" -f "$CHARSET" -c < "/tmp/$TABLE.export.tmp" > "/tmp/$TABLE.export.tmp.out"

    echo "loading $TABLE"
    /usr/bin/psql "$DBNAME" -c "copy $TABLE from '/tmp/$TABLE.export.tmp.out'"

    # clean up
    rm "/tmp/$TABLE.export.tmp" "/tmp/$TABLE.export.tmp.out"
done



4) If inserts fail and you see "duplicate key violates unique constraint"



Look at http://hcmc.uvic.ca/blogs/index.php?blog=22&p=8105&more=1&c=1&tb=1&pb=1



How to fix PostgreSQL error "duplicate key violates unique constraint"

If you get this message when trying to insert data into a PostgreSQL database:





ERROR: duplicate key violates unique constraint



That likely means that the primary key sequence in the table you're working with has somehow become out of sync, likely because of a mass import process (or something along those lines). Call it a "bug by design", but it seems that you have to manually reset the primary key sequence after restoring from a dump file. At any rate, to see if your values are out of sync, run these two commands:





SELECT MAX(the_primary_key) FROM the_table;



SELECT nextval('the_primary_key_sequence');



If the first value is higher than the second value, your sequence is out of sync. Back up your PG database (just in case), then run this:





SELECT setval('the_primary_key_sequence', (SELECT MAX(the_primary_key) FROM the_table)+1);



That will set the sequence to the next available value that's higher than any existing primary key in the sequence.



Note that psql's \ds meta-command (which lists sequences) may be needed to see what the_primary_key_sequence is actually called.
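If you would rather not hunt for the name, PostgreSQL's default sequence name for a serial column follows the <table>_<column>_seq convention, so the resync statement can be generated (a sketch; the table and column names below are placeholders):

```shell
# derive the default sequence name (PostgreSQL convention: <table>_<column>_seq)
# and print the resync statement; the names here are illustrative placeholders
TABLE="the_table"
PKEY="the_primary_key"
SEQ="${TABLE}_${PKEY}_seq"
echo "SELECT setval('${SEQ}', (SELECT MAX(${PKEY}) FROM ${TABLE})+1);"
```

Pipe the printed statement into psql to apply it; PostgreSQL's pg_get_serial_sequence('the_table', 'the_primary_key') function can also look the name up authoritatively.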