
View Full Version : Backup/Restore larger database



verdon
10-02-2003, 09:28 AM
I've got a site a couple weeks from going live (can't give out the URL yet) that is using OR as the engine for an Artist's Association site. Currently, there are about 260 users with appx. 1,000 listings. The db is getting fairly big. (If anyone is curious, OR is still speedy and performing well with that number of listings).

I recently tried, via phpMyAdmin, to back up and restore the db. Backup is fine, but phpMyAdmin seemed to choke on the size of the SQL file when I tried to restore from my backup. I ended up having to break up the SQL backup file into a couple of smaller files and restore the db in pieces.

Is there a better tool I could have used than phpMyAdmin for this? It's pretty much all I've used. I've played around with a few commercial MySQL GUI clients on my Mac, but always return to phpMyAdmin.

Thanks,
verdon

the_sandking
10-02-2003, 12:09 PM

<plug>
Look at Navicat 5.1 at http://www.mysqlstudio.com
</plug>

They have a trial version of their software available which I use and love... but for the Mac? You may be stuck with web-based apps for a while. I have a friend with a Mac who keeps an old PC around just to use a couple of GUIs for doing web development.

If you are running OS X or later, you can supposedly run X Windows, and I had seen some X-based GUIs for MySQL out there while I was looking for one last year. I never went that way, but you may find something that works by looking in that direction.

Learning to use MySQL from the command line can change your life, and save your arse occasionally. A DB backup takes about 1 second from the command line, and a restore takes maybe 2 seconds. Not all webhosts allow shell access, though, so command-line backup/restore may be of no help either.
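For example, a dump from the command line looks something like this (the user, database, and file names are just placeholders; substitute your own):

```shell
# Dump one database to a plain SQL file (prompts for the password)
mysqldump -u username -p database_name > backup.sql

# Or gzip it on the way out if the database is big
mysqldump -u username -p database_name | gzip > backup.sql.gz
```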

I am curious: about how many records per listing are in your DB? I'm working on a site with about 1,000 listings, but each listing has 50 records in listingDBElements, making that DB table about 50,000 rows. This *does* in fact slow down listing_browse.php considerably, but I have managed to tweak things here and there so that users don't fall asleep while waiting for searches to return results. Since listing_browse does not call upon the UsersDB, having a large number of users should not affect much there.


Good luck!

greengiant
10-02-2003, 12:23 PM
Sandking,

What tweaks did you have to make to listing_browse? Want to share a database dump with me, so I can do some speedup work?

verdon
10-02-2003, 01:24 PM
Thanks for the tips...

I do use OS X, but not X Windows. There is a Navicat version for Mac OS X that I looked at a while ago; I'll have to revisit it. CLI for doing this sounds like a good idea... I do have shell access, I guess I'll have to do a little reading. Would I use curl or something to load a PHP file full of SQL statements, or would I deal directly with the SQL server?

As to rows per record... I guess that would relate to the number of fields in a listing. 1 field in a record = 1 row in listingsDBElements. I likely have considerably fewer fields in my listings.

In specific, I have

listingsDB = 978 rows
listingsDBElements = 8865 rows

UserDB = 286 rows
UserDBElements = 7920 rows

BTW... wouldn't the number of users have some effect on listingsearch.php, as (at least in my case) the user's name is being queried for and echoed for each listing? Also, in listing_browse.php and in listingsearch.php I have integrated users so I can search by and sort by users, in addition to the standard stuff.

cheers, v

the_sandking
10-02-2003, 03:58 PM

We're back to that same old problem: if we wish to retain OR's form-based field generation (and flexibility) for listings, we must pile all the records into 2 columns within listingsDBElements (field_name, field_value).

Some of the tweaks I have done thus far are to move certain items from listingsDBElements to their own new columns in listingsDB (logical items such as 'Price' and, in my case, 'MLS#'). I have also crafted an "improved" version of the 'next_prev' function that requires a little less time to gather its info and render it back.
(I will post this function here as a mod after I "un-Sandking" the dependencies for my version. I'm pretty happy with it. Ryan, you have the site address, you can see it there)

The IDX rules require that credit be given to each listing agent and their office. This requires 4 IDX-specific records, i.e. IDXagent_id, IDXagent_name, IDXoffice_id, IDXoffice_name. So, next, I am planning to move these records to their own columns in listingsDB as well. 4 fewer records x 1,000 listings is 4,000 fewer rows in listingsDBElements, which means that listing_browse has to loop through 4,000 *fewer* items when looking for the display_on_browse records.
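The migration itself is just a few statements, something like this (the column and key names below are illustrative guesses, not OR's actual schema):

```sql
-- Give price its own column on listingsDB
ALTER TABLE listingsDB ADD COLUMN price DECIMAL(12,2);

-- Copy the values across from the generic elements table
UPDATE listingsDB l, listingsDBElements e
SET l.price = e.field_value
WHERE e.listing_id = l.listing_id
  AND e.field_name = 'price';

-- After verifying, remove the now-redundant rows
DELETE FROM listingsDBElements WHERE field_name = 'price';
```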

Now the good news: what used to take 15 seconds to display (when the webhost was under heavy load) now only takes 7-8 seconds under similar load conditions.

The bad news is that the above only applies to "residential" listings. I have to get started importing "Commercial property", "Vacant Land", etc., which will easily double the size of listingsDB, and who knows how many records in listingsDBElements, because I haven't determined what records to harvest from those IDX DBs yet. It will likely be much less than the 50 each I am grabbing for 'Residential'.

Most likely, I will separate these entirely, i.e. commercialDB, commercialDBElements, etc... I already did this for 'rentals' a while back, it works real nice..

BTW, the MySQL query cache feature ROCKS! Too bad I can't use it in production....yet...
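(For the curious: the query cache arrived in MySQL 4.0.1 and gets switched on in my.cnf; the sizes below are just example values.)

```
[mysqld]
query_cache_type = 1
query_cache_size = 32M
```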

If you want a copy of the elements DB, I can send it to you, CSV?

the_sandking
10-02-2003, 04:18 PM

Indeed it will slow down some, but with only 286 users, that's small potatoes, because you probably don't have more than 20 records associated with each in UserDBElements. Also, your search mod is probably just looking for the user's name, which comes straight out of UserDB. My last post may shed some light on my specific issue, and how it *might* apply to you as your site grows.

The following example shows how one might load a TD (tab-delimited) text file into MySQL from the command line.

> mysql -p (enter your password)

mysql> USE database_name (<-- use your DB)

mysql> LOAD DATA INFILE "/www/site/admin/IDX/listings-residential.txt" INTO TABLE your_db_table;


MySQL also has backup/restore scripts available from the command line; I forget which ones (Navicat keeps me out of there most days), but the MySQL online documentation (with user comments) is a fantastic resource. (This is why I own 4 SQL7 books and 0 MySQL books.)
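The scripts in question are almost certainly mysqldump for backups and the plain mysql client for restores; a restore of even a large dump is a one-liner (the names below are placeholders):

```shell
# Restore a mysqldump file, even a large one, in one line
mysql -u username -p database_name < backup.sql

# Works for a gzipped dump too, without unpacking it first
gunzip < backup.sql.gz | mysql -u username -p database_name
```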

If the Mac OS version of Navicat is half as good as the WIN version I have been using, you will be pleased.

verdon
10-03-2003, 10:50 AM

I checked into Navicat. The current version for MacOSX is 4.2.5 (I think). I realised I had already tried a demo a while back. When I went to revisit it, the demo period had expired. Navicat for MacOSX is fairly well reviewed on versiontracker.com but at $90 USD it's pretty pricey shareware :)

Before I coughed up the money, I thought I'd try a few other alternatives. For any other Mac OS X users who may be reading this, CocoaMySQL is freeware and VERY nice. It was intuitive to use, seemed solid, and definitely did the job with my larger restore file, at least when working with localhost. I'm going to try it this morning with a remote db server.

regards,