Guest
11 Apr 2014 10:35

Hi there

I'm currently using Cobalt just to display data in a table. I have made my own filters and it all works pretty well (I'm new at website building). My problem comes when I want to actually put in data, which I do with the CSV import function.

I'm importing into a section with 12 fields. Nothing major there. The data, however, will eventually become fairly large (maybe 100k to 500k articles), but I can't seem to import more than around 100 articles before the server times out. I get an error like "Fatal error: Maximum execution time of 30 seconds exceeded in .../libraries/joomla/language/language.php on line 376". I can't access php.ini, as it is a commercial server. Any chance I can fix this by changing some settings in Cobalt? Or is it a "bug" that needs addressing? It must be possible for Cobalt to split the data into pieces it can process and then continue...
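From what I've read, even when php.ini is off limits you can at least check what the host actually enforces with a small throwaway script (plain PHP, nothing Cobalt-specific; the file name is just my example):

<?php
// limits.php: print the limits this host actually enforces.
echo 'max_execution_time: ', ini_get('max_execution_time'), " seconds\n";
echo 'memory_limit: ', ini_get('memory_limit'), "\n";

That should line up with the 30 seconds mentioned in the error message.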

Another thing is the lack of being able to mass delete. I have my data in a local database and export it to CSV. I would like to be able to delete all records and then import the new ones, as some data needs to be unpublished, which I don't want to do manually. Or is it possible to have a field you import from CSV that determines if the record is published? Then I can just update data instead of deleting and inserting new.

Otherwise Cobalt seems great!

Thank you!

Ove

Last Modified: 02 May 2014


Sergey
Total posts: 13,748
11 Apr 2014 12:05

A suggestion for mass delete:

  1. Edit your type.
  2. Save and new.
  3. Delete the old type, and all articles of this type will be cleaned.
  4. Edit the section and change the type there.

I know it is tricky, but we will think of something for mass actions.

And I added

set_time_limit(0);
ini_set('max_execution_time', 0);

to the import process. After the next update it should not display this error.
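For anyone curious, a minimal sketch of how those lines sit at the start of an import request (the surrounding comments and placement are my illustration, not the exact Cobalt source):

<?php
// Lift the PHP execution time limit before a long CSV import.
// set_time_limit(0) resets the timer and removes the cap for this request;
// the ini_set() call does the same through the ini layer, as a fallback.
set_time_limit(0);
ini_set('max_execution_time', 0);

// ... then read the CSV and save each row as an article ...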


Guest
11 Apr 2014 15:21

Thank you very much! The trick worked a treat. I tried deleting the sections before, but didn't think of the types. I assume you meant "save as copy", just in case anyone else has the problem.

Looking forward to the update.

Ove


Guest
28 Apr 2014 11:29

Hi again

I still struggle to import. My server has safe mode on, and there is nothing I can do about that. Is that what is causing the following error: Warning: set_time_limit() [function.set-time-limit]: Cannot set time limit in safe mode in /var/www/.../components/com_cobalt/controllers/import.php on line 61?
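A tiny test file like this (plain PHP, my own throwaway script, not part of Cobalt) should at least show whether safe mode is really on:

<?php
// Does this host run PHP in safe mode? In safe mode, set_time_limit()
// and ini_set('max_execution_time', ...) are both disabled, which would
// explain the warning above.
var_dump((bool) ini_get('safe_mode'));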

I still get the execution time exceeded error, but I assume it's because of the previous error...

Am I screwed completely because of my server's safe mode policy?

Ove


Sergey
Total posts: 13,748
28 Apr 2014 15:01

Guest Am I screwed completely because of my server's safe mode policy?

Yes and no. Yes because you are limited and no because it adds to security.


Guest
28 Apr 2014 20:15

Thanks for replying again!

So does this mean that I have no chance of doing the CSV import? Or is there some way around it, either for me or on your side? Surely I can't be the only one with safe mode on, but I haven't seen any posts about it on the forum.

Any ideas?

Ove


Sergey
Total posts: 13,748
29 Apr 2014 03:31

I do not know how to bypass this problem.


Sackgesicht VIP
Total posts: 1,636
29 Apr 2014 08:21

Even without a time limit, the import (at least when I checked it a while ago) has other issues. As far as I remember, importing more than 2K records was not possible, even on machines with 16 GB RAM.

But maybe it was addressed already.
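If memory is the bottleneck, the usual culprit is reading the whole file into memory at once. Streaming it row by row keeps memory flat no matter how big the CSV is; a generic sketch (not the actual Cobalt code):

<?php
// Stream the CSV one row at a time instead of loading it all at once
// (e.g. via file()), so memory use stays flat regardless of file size.
$handle = fopen('articles.csv', 'r');   // file name is just an example
while (($row = fgetcsv($handle)) !== false) {
    // ... map the columns to fields and save one record ...
}
fclose($handle);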


Guest
30 Apr 2014 11:30

If that's the case, then isn't that a major problem? I mean, there must be many users who need to import larger amounts of data.

I don't know much about the import process, but wouldn't it be possible on the program's side to make it read, say, 1000 lines of the CSV file, import those, finish the process and resume with the next 1000? This way, maybe, you avoid both the execution time problem and the "too much data" problem.
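Something like this is what I imagine (a rough sketch only; I am guessing at how it could look, this is not Cobalt code):

<?php
// Import the CSV in batches of 1000 rows, skipping rows handled in
// earlier runs, so each request stays under the execution time limit.
// The offset would be carried over from the previous request.
$batchSize = 1000;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$handle = fopen('articles.csv', 'r');   // file name is just an example
for ($i = 0; $i < $offset; $i++) {
    fgets($handle);   // skip earlier batches (assumes one record per line)
}

$done = 0;
while ($done < $batchSize && ($row = fgetcsv($handle)) !== false) {
    // ... save one record ...
    $done++;
}
fclose($handle);

// If $done reached $batchSize there is more to do: redirect to the same
// script with offset = $offset + $batchSize for the next batch.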

In case there is no solution to this, I assume I can somehow import data directly into the website's database (I don't know anything about this part). Can I then get Cobalt to understand where I want it, i.e. types, fields etc.?

Ove


Sergey
Total posts: 13,748
01 May 2014 10:35

Guest In case there is no solution to this, I assume I can somehow import data directly into the website's database (I don't know anything about this part). Can I then get Cobalt to understand where I want it, i.e. types, fields etc.?

It is possible. This is in fact what the import does: it just puts data into the tables according to the DB schema. But explaining how Cobalt stores articles would take too long.


Guest
01 May 2014 23:27

OK, I might have found a workaround that'll work for me: setting up an ODBC link between my database and _js_res_records and _js_res_records_values in MySQL (I am not using categories). It seems pretty straightforward to work directly in the DB, and with the sections and types set up, this should work, right? There will be an issue with the images, I think, but if I do a Cobalt import with a tiny bit of the data that has the image links, then the images will be stored correctly. I hope...

This is my last idea to get this to work. I tried copying the website down locally, importing and then putting it back up. That didn't really work, and it would be a nuisance to do over and over.

Hopefully this works, and maybe it'll help others in the same situation.

Ove


Sergey
Total posts: 13,748
02 May 2014 03:37
  1. Please pay attention to the #__js_res_record_values table. This table stores the values of the fields for filtering. Make sure you save values there if you need this.
  2. Look at the "fields" column in #__js_res_record. It contains an array of field values for display; the first-level key is a field ID. A sketch follows below.
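To make this concrete, a very rough sketch of a direct insert through Joomla's database API. All column names except "fields" are placeholders I made up for illustration; check them against the real schema in your database first:

<?php
// Rough sketch only. Verify every column name against your own
// #__js_res_record / #__js_res_record_values schema before using.
$db = JFactory::getDbo();

// 1. The record itself. The "fields" column holds the field values
//    for display, keyed by field ID (storage format assumed here).
$record = (object) array(
    'title'  => 'My article',                       // placeholder column
    'fields' => json_encode(array(5 => 'value')),   // key 5 = a field ID
);
$db->insertObject('#__js_res_record', $record, 'id');

// 2. One row per filterable value, so filtering keeps working.
//    These column names are placeholders as well.
$value = (object) array(
    'record_id'   => $record->id,
    'field_id'    => 5,
    'field_value' => 'value',
);
$db->insertObject('#__js_res_record_values', $value);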