CiviCRM Community Forums (archive)

This forum was archived on 25 November 2017. Learn more.



  • CiviCRM Community Forums (archive) »
  • Old sections (read-only, deprecated) »
  • Support »
  • Upgrading CiviCRM (Moderator: Deepak Srivastava) »
  • Importing Large SQL Files
Pages: [1]

Author Topic: Importing Large SQL Files  (Read 8418 times)

Denver Dave

  • Ask me questions
  • ****
  • Posts: 471
  • Karma: 9
Importing Large SQL Files
June 11, 2010, 09:08:34 am
In trying to upgrade our production Linux / Drupal server to the latest CiviCRM version, we would like to load the 2 million+ records on a test server first. I've been experimenting with importing our 195 MB backup file into a XAMPP / PHP / MySQL installation on my PC.

(1) The import times out with phpMyAdmin, and I haven't been able to fix it with php.ini settings.
(2) The command-line MySQL client also times out.
(3) bigdump.php seems to have hit a file-upload limit:
Warning: POST Content-Length of 199333957 bytes exceeds the limit of 134217728 bytes in Unknown on line 0

With BigDump, I'm thinking the POST error may be a limit on the upload size rather than a limitation of BigDump itself, so I'm wondering whether the script could be modified to let me FTP the file up first and then read it directly from disk.
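For what it's worth, 134217728 bytes in that warning is exactly 128M, which matches PHP's default `post_max_size` ceiling. If the upload is to go through the browser at all, both upload limits have to exceed the 195 MB dump. A hedged php.ini sketch (values are illustrative, not a recommendation; restart Apache after editing):

```ini
; php.ini -- 134217728 bytes = 128M, the post_max_size ceiling the
; BigDump warning is reporting. Raise both limits past the 195 MB dump.
post_max_size       = 256M
upload_max_filesize = 256M
; give a long-running import script room (0 = no time limit; use with care)
max_execution_time  = 0
```

That said, BigDump's chunked approach sidesteps upload limits entirely when the file is already on the server, which is the appeal of the FTP idea above.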

Another possibility is perhaps splitting the SQL file into smaller pieces.

How are others loading large MySQL files onto new servers?
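The split-into-pieces idea can be sketched in shell. This is a toy stand-in (the generated `sample.sql` and the `piece_` prefix are placeholders, not real data); splitting on line count is only safe when each INSERT sits on its own line, which phpMyAdmin and mysqldump exports both do by default:

```shell
# Build a stand-in "dump" of 250 one-line INSERT statements.
seq 1 250 | sed 's/.*/INSERT INTO t VALUES (&);/' > sample.sql

# Split into 100-line pieces: piece_aa (100), piece_ab (100), piece_ac (50).
split -l 100 sample.sql piece_
ls piece_a?

# Each piece would then be fed to the server in order, e.g.:
#   for f in piece_a?; do mysql -u root hcac2 < "$f"; done
```

Smaller pieces each finish well inside any execution-time limit, at the cost of restarting the client per chunk.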

= = = = =
Later note: I did a test with fopen() and fgets() and was able to read the 195 MB, 1.3 million-line phpMyAdmin export file... so maybe an on-the-server SQL file approach with BigDump would work.
« Last Edit: June 11, 2010, 12:25:18 pm by Denver Dave »

xavier

  • Forum Godess / God
  • I’m (like) Lobo ;)
  • *****
  • Posts: 4453
  • Karma: 161
    • Tech To The People
  • CiviCRM version: yes probably
  • CMS version: drupal
Re: Importing Large SQL Files
June 11, 2010, 10:21:01 am
The MySQL command line times out? What's the error message?

X+

-Hackathon and data journalism about the European Parliament, 24-26 Jan. Watch for the results.

Denver Dave

  • Ask me questions
  • ****
  • Posts: 471
  • Karma: 9
Re: Importing Large SQL Files
June 11, 2010, 03:25:50 pm
The MySQL command run from a .bat file called from the command line is:

C:\xampp\mysql\bin\mysql.exe -u root hcac2 < "C:\aa_backups\Data_20100602\info_drupal.sql"

The generated error message in the command window is:

ERROR 1064 (42000) at line 1349721: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '<br />
<b>Fatal error</b>:  Maximum execution time of 300 seconds exceeded in <b' at line 881

= = = = =
Looks like 1,362,848 records out of over 2.3 million were loaded.

Well, isn't that interesting... Checking the bottom of the export file, the phpMyAdmin backup on the production server itself hit the max execution time, and that error at the bottom of the file may be the problem.
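A quick way to catch this kind of truncated export is to inspect the dump's tail before trusting it: an export that died mid-run ends in an HTML error blob instead of SQL. Simulated here with a fabricated file (the real check is just the `tail`/`grep` on your actual dump):

```shell
# Fabricate a truncated phpMyAdmin export to show what to look for.
printf 'INSERT INTO t VALUES (1);\n<br />\n<b>Fatal error</b>: Maximum execution time of 300 seconds exceeded\n' > dump.sql

# A healthy dump ends with SQL statements; a dead one ends like the above.
if tail -n 5 dump.sql | grep -q 'Fatal error'; then
  echo "dump is truncated"
fi
```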

Yikes - better get a good backup, this time without the activity tables. Done.

Now BigDump works, except that it stopped while creating the foreign key into civicrm_mailing_event_queue, a table I did not download. ... The mysql command line also works, with the same consideration for that table.

Guess I'd better find a way to back up other than phpMyAdmin.

Thanks for now.

torenware

  • I post frequently
  • ***
  • Posts: 153
  • Karma: 4
Re: Importing Large SQL Files
June 12, 2010, 10:34:53 pm
If you have command-line access (and you really need it if you don't have it, for a DB that big), mysqldump is your friend. It's quick, it's easy, and it can be automated.
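A minimal sketch of that, reusing the database name from earlier in the thread (hcac2) as a placeholder; it prints the command so you can copy it to the server rather than running it here:

```shell
DB=hcac2
OUT="${DB}-$(date +%Y%m%d).sql"   # dated file name, e.g. hcac2-20100611.sql

# --single-transaction takes a consistent InnoDB snapshot without locking
# tables, and unlike phpMyAdmin there is no web execution-time limit to hit.
echo "mysqldump -u root -p --single-transaction $DB > $OUT"

# Restore on the test box with:  mysql -u root -p hcac2 < hcac2-YYYYMMDD.sql
# Cron can automate the dump, e.g. nightly at 02:00 via a small wrapper script.
```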

Setting this stuff up on a Windows system takes a braver man than I.  Even using cygwin, it's just not the same.

Denver Dave

  • Ask me questions
  • ****
  • Posts: 471
  • Karma: 9
Re: Importing Large SQL Files
June 15, 2010, 11:41:05 am
I'm not planning on running the production system on my XP PC; I hope to use it to facilitate the upgrade from Drupal / CiviCRM 1.9 to 3.15. Already, the process showed me that my backups were not complete, and I'm no longer using phpMyAdmin for backups.

I don't really have to run the system, perhaps I could just upgrade?

I'm trying out the CiviCRM dbtools 1.2 on the XP PC, but I'm getting the error "Apache HTTP Server has encountered a problem and needs to close." The issue seems to be with php5ts.dll. I'm in the process of testing each step to see where the problem is. On another thread, it was suggested that dbtools might help convert to a later version.

I waffle back and forth between upgrading and uploading. Uploading assigns new id numbers for contact, phone, email, log, and probably other tables, which could pose a challenge.

We are a little better at upgrading, so I'll also try rotating through the upgrades since 1.9. I'd hoped to do this on the XP XAMPP installation, but may have to use the test Linux server.

Anyone care to share their experiences?

rcrusoe

  • Guest
Re: Importing Large SQL Files
June 17, 2010, 11:37:20 am
DenverDave, I feel your pain. The good news is, it's not really unique to CiviCRM. The bad news is, it's still painful, and perhaps more so than it needs to be for CiviCRM. Part of the problem is the infrastructure/tools for db work in this environment, although folk working on code or themes also look for help with tools for analysis and debugging. Import, export, deduping, data cleaning, integration of external data, and related tasks (variously categorized as ETL, Data Quality, MDM, etc.) are not practical with browser-based utilities or, quite frankly, CLI-based scripts, unless you're a real wizard with the MySQL command prompt and regular expressions. Until recently I never found any really good tools to use with MySQL, though the latest releases of MySQL Workbench look pretty good so far.

What I've done in the past is simply take a dump of the MySQL db from the web server running the production or public version of the app, and load it onto a local copy of MySQL under windoze. Then I can use some familiar industrial-strength tools to analyze the db and the data, both sources and destination, and begin to work with it. MS Access and SQL Server work very well for this, much as we all hate M$ and closed-source software. From an Access db I can simply use ODBC or JDBC to attach the tables in the MySQL db and do whatever needs to be done. SQL Server has SSIS, which provides extensive, sophisticated tools that can handle very large, complex jobs and automate the process for production work if need be. Do your imports, edits, whatever, to the attached MySQL db, then dump it from the copy you're running on the windoze box, upload it, and restore it to the MySQL server on your web site.

The only thing the CiviCRM native import/export utilities are really useful for is importing and testing/editing sample files (50 records, say) in order to snapshot the tables and data structures so you can see how things are put together.

The difficulty with all this is having access to the tools, either M$ or others (Informatica, IBM, etc.), as they are very expensive and proprietary. There is an open-source alternative, Talend, which I found through a thread on the Civi forums, that looks quite good, and I'm doing some testing with it. It's open source and uses Eclipse as the base for the workbench, which facilitates plugins for things like connectors to specific products and dbs. If anyone has more experience with the Talend tools, I'd love to hear from them.

The other major issue is the CiviCRM db structure/schema - my feeling is it's gotten a little tangled and overgrown (over 140 entities, I think) and could use some pruning/simplification. Someone posted an ERD of Contacts on the CiviCRM blog recently, done in MySQL Workbench, but if you load the whole db and try to diagram it, it's completely unintelligible. I don't want to be one to point out problems without having a proposed solution, but without some friendly client or funding source to support such an effort, I haven't the bandwidth to tackle it in the short term. As I work through some current import and data-integration tasks, I'll try to feed back what I find.
Regards,
Howard J

Donald Lobo

  • Administrator
  • I’m (like) Lobo ;)
  • *****
  • Posts: 15963
  • Karma: 470
    • CiviCRM site
  • CiviCRM version: 4.2+
  • CMS version: Drupal 7, Joomla 2.5+
  • MySQL version: 5.5.x
  • PHP version: 5.4.x
Re: Importing Large SQL Files
June 17, 2010, 03:30:42 pm
Quote from: rcrusoe on June 17, 2010, 11:37:20 am
The other major issue is the CiviCRM db structure/schema - my feeling is it's gotten a little tangled and overgrown (over 140 entities I think) and could use some pruning/simplification.

do you have any basis for making the above statement, or some examples of where tables should be pruned?

saying that it is tangled and overgrown BECAUSE it has 140 entities (126, to be more precise, in 3.2) does not seem very logical, IMO

lobo
A new CiviCRM Q&A resource needs YOUR help to get started. Visit our StackExchange proposed site, sign up and vote on 5 questions

Denver Dave

  • Ask me questions
  • ****
  • Posts: 471
  • Karma: 9
Re: Importing Large SQL Files
June 17, 2010, 06:29:36 pm
I would like to say again how much I appreciate all the work that has gone into CiviCRM. Now that we are running 3.1x, for some situations it is a big operational improvement over v1.9, and I look forward to having our production system on the new version.

There are a lot of tables, but the few times I've had to write our own queries, I did not find them hard to work with. Finding the CiviCRM program code that performs certain functions - now that was hard, and I had to get help.

Thank you, thank you, thank you.

Note to self and others - don't forget to donate sometimes:
http://civicrm.org/donate

