Unable to import customers using SQL

Hi Guys,

I hope I’m posting this in the correct category; this is not a problem with Samba, it’s more of a PHP issue.

I followed the instructions for the product import PHP and used a modified version from John to import customers instead.

  1. Exported the existing sample customer database using SQL
  2. Added a couple of customers
  3. Placed the customer file in c:\wamp\www
  4. Edited your PHP file to reflect my connection string
  5. Edited line 62 to reflect my native client

I’m using PHP 5.6.40

Any ideas why?

Regards,

The error tells you: you have fewer columns in the insert section than values…

Your insert statement includes the ‘Id’ column. ‘Id’ is an identity column that auto-increments. Unless that column, ‘Id’, is specifically referenced by something you need, your insert statement should exclude it:

INSERT INTO Entities (EntityTypeID, LastUpdateTime, SearchString, CustomData, Notes, AccountId, WarehouseId, Name)...

Also, you have ‘0’ for LastUpdateTime. This column cannot be null and must be a datetime. You’ll have to format the current datetime to yyyy-MM-dd HH:mm:ss.fff for the insert statement.
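The thread’s importer is PHP, but the formatting requirement is language-agnostic. As a minimal Python sketch, here is one way to produce a yyyy-MM-dd HH:mm:ss.fff timestamp (the fixed date is only for illustration; you would normally use the current time):

```python
from datetime import datetime

# Format a datetime as yyyy-MM-dd HH:mm:ss.fff (milliseconds).
# strftime's %f gives microseconds, so trim the last three digits.
now = datetime(2020, 7, 10, 18, 36, 40, 637000)  # fixed value for illustration
stamp = now.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
print(stamp)  # 2020-07-10 18:36:40.637
```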


Is there a reason you need to import customers via PHP?

You can import customers as a CSV right within SambaPOS using Batch Create Entities. I can see you were given details of this before.

I have added customers using the Batch Create Entities many times without problem. The issue you got there is most likely due to the data you tried to import, but I note you did not share an example.

I recommend you use the Batch Create Entities feature and do not pursue this PHP method, as it is unnecessary.


Hi Mark,

I have tried the batch import but it kept on failing with errors. I have 5000 customers in the database; it was failing randomly and I couldn’t find the reason why. Someone else recommended using PHP.

Here is a sample of my customer import file.

importVIP - Copy.zip (393 Bytes)

At a guess it’s maybe timing out, similar to clearing transactions on a large database.

Not sure what you mean by timing is out.

not timing is out, timing out

Due to the size you’re trying to import (5000 records), the system may not be able to upload it all in one go, so it times out and gives you the error.

Try importing in smaller groups of records: try 100 first to make sure the upload works fine, then 500 or 1000 more, and see if they work until you have uploaded them all.

If the first 100 still fail then there’s probably something not right in the format you’re trying to upload.
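The batching advice above can be sketched in a few lines. This is a hedged Python illustration of splitting rows into fixed-size groups (the row contents are made up; it is not part of the actual import script):

```python
# Hypothetical sketch: split a large list of CSV rows into smaller
# batches so each paste into Batch Create Entities stays small.
def chunk(rows, size):
    """Yield successive slices of `rows` with at most `size` items each."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

rows = [f"customer{n}" for n in range(5000)]  # placeholder data
batches = list(chunk(rows, 100))
print(len(batches))     # 50 batches
print(len(batches[0]))  # 100 rows each
```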

Hi Rickh,

I only have a single record in my spreadsheet, as attached. Once that is successful, I’ll do it in batches of maybe 1000. Any idea why I’m having this issue?

Regards

5000 might just be too much to upload at once

No, I’m not doing 5000 at once; I’m testing with a couple of records, then I’ll do 500 or 1000 at a time.

As he said, if 1 fails then it’s likely an issue with the format…
I’m sure I said on your other post to show the CSV format you’re pasting into the import.

I’m sure i’m doing something wrong. Just not sure what it is.

I have attached a copy of the PHP, the customer CSV and the error message I’m getting. Hope that helps.

(EDIT: removed due to containing real customer data)

What are you pasting into bulk import when it doesn’t work…

Hi Memo,

I have tried excluding ‘Id’ but still have the same issue.

The batch create is the best method. If you did something wrong, we can help you, but we need to see the mistake.

Looking at your upload (Customer Import.zip), I see a problem with how the csv is processed. Because the JSON itself contains commas, and a comma is also the CSV delimiter, it’s causing a problem reading the file.

[{"Name":"Street", "Value":"1 Queen St"}...

The above, and data that follows, encompasses the CustomData column.

So, there’s an issue with the ‘E’ part of ‘ETL’ (extract, transform, load). The data needs to be properly formatted.

If you’re able to specify the delimiter in your export, use ‘~’ (tilde). If you can then provide a sample data set, I should be able to make something work for you.

Thanks for your help, but I’m not sure what you mean by that.

Delimiter is what is used to separate data. CSV uses a comma for example.

Firstly, looking at your importVIP.csv, it uses a tab delimiter and not a comma.

Your file shows: Id EntityTypeId LastUpdateTime SearchString CustomData Notes AccountId WarehouseId Name
it should be Id,EntityTypeId,LastUpdateTime,SearchString,CustomData,Notes,AccountId,WarehouseId,Name

(I have omitted the “500” at the start of your sample data line as it relates to the auto-increment column)

The other issue arises from the JSON array. With a comma delimiter,
1,2020-07-10 18:36:40.637,NULL,[{"Name":"Street","Value":"1 Any St"},{"Name":"Suburb","Value":"St Marys"},{"Name":"Phone","Value":"021234567"},{"Name":"Postcode","Value":"1111"}],NULL,0,0,John Smith

is read as these columns:

1
2020-07-10 18:36:40.637
NULL
[{"Name":"Street"
"Value":"1 Any St"}
{"Name":"Suburb"
"Value":"St Marys"}
{"Name":"Phone"
"Value":"021234567"}
{"Name":"Postcode"
"Value":"1111"}]
NULL
0
0
John Smith

That’s 15 columns, and I think this is where your error came from earlier.
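A quick way to see this, sketched in Python rather than the thread’s PHP: a naive comma split of that row really does produce 15 fields.

```python
# The embedded JSON also contains commas, so splitting the row on
# commas yields 15 fields instead of the intended 8.
line = ('1,2020-07-10 18:36:40.637,NULL,'
        '[{"Name":"Street","Value":"1 Any St"},'
        '{"Name":"Suburb","Value":"St Marys"},'
        '{"Name":"Phone","Value":"021234567"},'
        '{"Name":"Postcode","Value":"1111"}],'
        'NULL,0,0,John Smith')
fields = line.split(",")
print(len(fields))  # 15
```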

When you export, if you can change the delimiter from ‘,’ to ‘~’ this will yield the following:

1~2020-07-10 18:36:40.637~NULL~[{"Name":"Street","Value":"1 Any St"},{"Name":"Suburb","Value":"St Marys"},{"Name":"Phone","Value":"021234567"},{"Name":"Postcode","Value":"1111"}]~NULL~0~0~John Smith

When you load the CSV in PHP you should be able to specify the delimiter. If you specify ‘~’ you’ll be left with this:

1
2020-07-10 18:36:40.637
NULL
[{"Name":"Street","Value":"1 Any St"},{"Name":"Suburb","Value":"St Marys"},{"Name":"Phone","Value":"021234567"},{"Name":"Postcode","Value":"1111"}]
NULL
0
0
John Smith

Now this corresponds to the 8 columns (EntityTypeID, LastUpdateTime, SearchString, CustomData, Notes, AccountID, WarehouseID, Name).
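Again as a Python sketch (the thread’s script is PHP, where fgetcsv likewise accepts a delimiter argument): parsing the same row with ‘~’ yields the 8 intended fields, because the commas inside the JSON are now plain data.

```python
import csv
import io

# With '~' as the delimiter, the commas inside the JSON no longer
# split the row; it parses into the intended 8 fields.
row = ('1~2020-07-10 18:36:40.637~NULL~'
       '[{"Name":"Street","Value":"1 Any St"},'
       '{"Name":"Suburb","Value":"St Marys"},'
       '{"Name":"Phone","Value":"021234567"},'
       '{"Name":"Postcode","Value":"1111"}]~'
       'NULL~0~0~John Smith')
fields = next(csv.reader(io.StringIO(row), delimiter="~"))
print(len(fields))  # 8
print(fields[-1])   # John Smith
```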
