Database migration

Kim Hawtin ryjkavik at
Mon Dec 2 16:10:49 CST 2002

 --- Andrew Reid <andrew.reid at> wrote: 

> I've got a database (well, my employer does, anyway) that is,
> unfortunately, a rather critical part of the business. It supports
> over 350 retail stores nationwide; all the gory details like sales,
> blah, blah, blah.

> To give you an idea of what's involved:
> * There's a Windows-based interface to the database, giving all of our
>   users buttons to press in order to get the reports that they want. 
>   Everything that you'd want to know, particularly financially, about
>   a particular store or the organisation as a whole, seems to live
>   here. 

You could build all the report-generation stuff into a series of
Perl scripts and put them on an internal web server - with session
based logins - so they could be accessed by all clients/customers
in-house or via dialup.
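A minimal sketch of what one such report script might look like, using
the standard CGI and DBI modules. The DSN, credentials, table and
column names here are all invented for illustration - substitute
whatever your schema actually looks like:

```perl
#!/usr/bin/perl
# Hypothetical per-store sales report, served via CGI.
# DSN, login, and schema are placeholders.
use strict;
use warnings;
use CGI qw(header param escapeHTML);
use DBI;

my $store = param('store') || '';
die "store id must be numeric\n" unless $store =~ /^\d+$/;

my $dbh = DBI->connect('dbi:Pg:dbname=sales', 'report', 'secret',
                       { RaiseError => 1 });
my $sth = $dbh->prepare(
    'SELECT week, SUM(amount) FROM sales
      WHERE store_id = ? GROUP BY week ORDER BY week');
$sth->execute($store);

print header('text/html'), '<h1>Store ', escapeHTML($store), '</h1><pre>';
while (my ($week, $total) = $sth->fetchrow_array) {
    printf "%-12s %10.2f\n", $week, $total;
}
print '</pre>';
$dbh->disconnect;
```

Session handling would sit in front of this (Apache auth, or a module
like Apache::Session), so each user only sees the stores they should.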

> * About 30 users in different areas of the country are sent a few CDs
>   containing the entire database, which they then load onto their
>   laptops, each week. Yeuck.

This could be achieved by a similar set of Perl scripts, but using
DBI with a flat-text-file driver... (and include a Perl installer
for the Windows clients)

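For instance, the weekly dump could be shipped as plain CSV files and
queried through DBI's CSV driver (DBD::CSV), so the laptop users get
SQL without installing a database server. The directory, file, and
column names below are invented:

```perl
#!/usr/bin/perl
# Sketch: querying a weekly flat-file dump via DBD::CSV.
# The 'sales' table maps to a CSV file in ./weekly_dump;
# all names here are placeholders.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:CSV:f_dir=./weekly_dump', undef, undef,
                       { RaiseError => 1 });
my $sth = $dbh->prepare(
    'SELECT week, amount FROM sales WHERE store_id = ?');
$sth->execute(42);
while (my ($week, $amount) = $sth->fetchrow_array) {
    printf "%-12s %10.2f\n", $week, $amount;
}
$dbh->disconnect;
```

The same scripts would run unmodified on the Windows laptops under a
Perl distribution for Windows.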
> * Another Windows-based utility dials into cash registers weekly,
>   sucking out sales data and feeding it back into the database. I
>   daresay that the entire project (when it does eventually get going)
>   will involve working out a better way of doing that. Making 300+ STD
>   calls each week gets reasonably pricey. I'd reckon that the Internet
>   could be involved.

It would be possible to use an Internet connection over a VPN
to dump updates to a data collection point, and then load them
into your main database...

> At this stage, I'm particularly interested in hearing about people's
> experiences and recommendations when it comes to database servers. I've
> been thinking PostgreSQL from the Open Source world, and probably
> Oracle from the commercial world, as options.

PostgreSQL will scale comfortably with sensible table design,
even with a couple of million rows per table...

MySQL is fast, but slows down with lots of concurrent users:
PostgreSQL does row-level locking, while MySQL does table locking,
so weigh raw speed against the number of concurrent users.

> My knowledge of Btrieve is fairly minimal (and I'd like to keep it
> that way, thank-you-very-much), so I thought I'd ask: has anyone had
> any experience in moving data out of Btrieve and into a more
> heavy-duty SQL server like PostgreSQL or Oracle? How much hair do you
> have left?

I might suggest you have a look at the DBI interface that Perl offers;
IIRC there was talk about it a couple of years ago.

> If anyone has any general comments and/or ramblings, please; I'd love
> to hear from you. It makes it look as though I'm doing something,
> which pleases my boss ;-)

While I am not a fully fledged Perl coder, I would suspect that a
Perl data-munging type tool on a Windows client would be capable
of extracting the data from your existing database and then inserting
it into your new shiny SQL DB on a Linux/BSD/Solaris box...
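As a rough sketch, assuming you can first get the Btrieve data out as
delimited text (Btrieve export tools or ODBC can usually manage that),
the load side is straightforward DBI. Filename, DSN, and table layout
are all placeholders:

```perl
#!/usr/bin/perl
# Sketch of a one-shot migration load: read records exported from
# the old Btrieve database as tab-separated text, insert into
# PostgreSQL. DSN, file layout, and column names are placeholders.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=sales;host=dbhost',
                       'migrate', 'secret',
                       { RaiseError => 1, AutoCommit => 0 });
my $ins = $dbh->prepare(
    'INSERT INTO sales (store_id, week, amount) VALUES (?, ?, ?)');

open my $fh, '<', 'btrieve_export.txt' or die "open: $!\n";
while (my $line = <$fh>) {
    chomp $line;
    my ($store, $week, $amount) = split /\t/, $line;
    $ins->execute($store, $week, $amount);
}
close $fh;
$dbh->commit;    # one transaction: all-or-nothing
$dbh->disconnect;
```

Running the whole load in a single transaction means a botched import
leaves the new database untouched.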



"I'd rather listen to Newton than to Mundie. He 
 may have been dead for almost three hundred years, 
 but despite that he stinks up the room less."
                                        -- Linus Torvalds


LinuxSA WWW: IRC: #linuxsa on
To unsubscribe from the LinuxSA list:
  mail linuxsa-request at with "unsubscribe" as the subject
