How to set DSN to not have record locks

maretix

Member
Good morning to all of you.

I have this problem.
I need to pull data from an OpenEdge database, version 10.2B, using the ODBC driver and SSIS in SQL Server 2008 R2.
There is no problem if there are no locks.
If there is a record lock, the package fails with an error.

My DSN is set with Default Isolation Level READ UNCOMMITTED and I use the WITH NO-LOCK option.
I also tried READ COMMITTED and without the WITH NO-LOCK option, but nothing to do....

Is the problem the 10.2B driver, or is there a way to solve this?

Regards in advance.
 

Chris Hughes

Hi

Are you trying to update records or not?

If you are just reporting on data, stick with read uncommitted; you shouldn't need WITH NO-LOCK either.
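
Just to illustrate (a minimal sketch, not something you need when the DSN already says read uncommitted): as far as I remember, OpenEdge SQL also accepts the standard SQL-92 statement at session level, and once it is in effect a plain SELECT needs no lock hint at all.

====================================
-- optional: set dirty reads for the session instead of (or as well as) the DSN default
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- after that, a plain SELECT takes no record locks and does not wait on writers
SELECT * FROM pub.customer;   -- "pub.customer" is just a placeholder table name here
====================================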

Not sure what you mean by "nothing to do"?

The problem isn't the driver; I've used it for years now.
 

maretix

Member
Hi Chris, thanks for your kind reply.
I only need to read records to populate a SQL Server table.
It is a customers table with 5,000,000 records.
I use OpenEdge 10.2B and I do not think it is a driver problem.
When I read the records at night there is no concurrency, so there are no lock problems and no lock errors.
Sometimes I need to read records during working days, because I need to run the SSIS package during the day.
In that case there are a lot of users who may be updating some of the records I need, and that is when I run into the lock problem.
Progress told me to set the DSN to READ UNCOMMITTED and not use the WITH NOLOCK option.
They told me that this way I would not have lock problems, but no luck, I get the same errors.

What do you suggest?
I have never used WITH READPAST, but I think with that I could get a wrong dataset.

Regards in advance
 

Chris Hughes

I agree with Progress; there is no need for NOLOCK with read uncommitted.

You just don't lock records with read uncommitted set, unless you have code changing the isolation level?

I don't specifically understand what your task is here, but pulling 5,000,000 records out of a database during peak usage sounds like overkill - and if you are running 32-bit Progress on the server this will be trashing your cache and probably causing the whole system to run slowly.

I worked at a place that did something like this, but it only ever copied changed records over, and data like telephone numbers and addresses was copied over during evening hours.
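
A rough sketch of what that incremental approach looks like, assuming the source table keeps some kind of last-change date (the column name below is invented, your ERP may or may not have an equivalent):

====================================
-- hypothetical incremental extract: only rows changed since the last successful run
-- "chg-date" is an assumed column name, not a real field
SELECT *
FROM pub.customer              -- placeholder for your customer table
WHERE "chg-date" >= ?;         -- last run timestamp, bound as an ODBC parameter
====================================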

An alternative you may want to look at is a product called Pro2 - it is a replication tool which would give you a near real-time copy of your Progress data in a SQL database. There is a webinar on this tomorrow: https://community.progress.com/c/e/293.aspx

Cheers
 

TomBascom

Curmudgeon
You might try posting actual error messages.

If you'd like to go "the extra mile" you could even show the code that produces the errors.
 

maretix

Member
You might try posting actual error messages.

If you'd like to go "the extra mile" you could even show the code that produces the errors.

Thanks Mr. Bascom.

I read data from an OpenEdge 10.2B database.
I use the 32-bit ODBC driver, because Visual Studio is a 32-bit application.
My code is inside an SSIS package in SQL Server 2008 R2.
The DSN isolation level is READ UNCOMMITTED, as the Progress help desk advised, and I do not use the WITH NOLOCK hint.
Every day I read all the customers to put them into a DW SQL Server table (like other tables; it is not only that one).
With no user activity there is no problem; with users working, I get key violation errors, even though the keys are not really duplicated.
Every day I also run the DBTOOL utility on all the tables involved in the DW, to prevent any field length problems.
OpenEdge Progress 10.2B runs on an AIX UNIX server and the ERP is an Italian ERP for retail companies.
In the past I used to pull data from text files, and I had no problems of this kind.
If I cannot find a solution, I will go back to using text files.

Thanks in advance.

SQL code:

===========================================

select '1' AS codazienda ,
gpdc.codpdc ,
gpdc.ragsoc ,
gpdc.indir,
gpdc.cap,
gpdc.local,
gpdc.stato,
jclasger.codice AS CodClaCli ,
jresp3.oper AS CodRespComm
, CASE ysetcli.codsett WHEN '' THEN 'D' ELSE ysetcli.codsett END AS Codsett
,agenti.agente,
gpdc.potencli,gpdc.class0,gpdc.class1,gpdc.class2,gpdc.class3,gpdc.class4,gpdc.class5,gpdc.class6,gpdc.class7, gpdc.class8,gpdc.class9, gpdc.abcfatt,gpdc.abcfreq,gpdc.numdip,gpdc.budget,gpdc.rischio,gpdc.numins,gpdc.grucli,gpdc.fido,gpdc.condpag,gpdc.codpdcgru,gpdc.codpdccol,gpdc.codgru,gpdc.prov
, gpdc.datchiu ,gpdc.dataper,gpdc.telefono,gpdc.email,
CASE gpdc.flaspe WHEN 1 THEN 'S' ELSE 'N' END as Flaspe,
CASE gpdc.flaspetr WHEN 1 THEN 'S' ELSE 'N' END as Flaspetr
,CASE gpdc.flabol WHEN 1 THEN 'S' ELSE 'N' END as Flabol,gpdc.partiva,gpdc.codfisc,gpdc.codlis,gpdc.lisrif, CASE gpdc.statocom WHEN '' THEN 'XXXXXXXXXX' ELSE gpdc.statocom END AS Statocom ,
gpdc.cellu, gpdc.rischio AS Rischio,
CASE gpdc.rischio WHEN 0 THEN 'Entro Fido' WHEN 1 THEN 'Fuori Fido' WHEN 2 THEN 'Fuori Fido Bloccante' WHEN 3 THEN 'Blocco Insoluto' END AS DescrizioneRischio,
gpdc.numins AS NumeroInsoluti,gpdc.insatt AS InsolutiAttivi,
gpdc.inspag AS InsolutiPagati,gpdc.datchiu AS DataChiusura,
CASE gpdc.addcos WHEN 1 THEN 'S' ELSE 'N' END AS AddebitoCavo,gpdc."pa-ipa" AS CodiceIpa,
CASE gpdc."pa-ftelet" WHEN 1 THEN 'S' ELSE 'N' END AS FlaFattPa

From pub.gpdc

LEFT OUTER JOIN pub.jclasger
ON gpdc.codpdc = jclasger.codrif
AND jclasger.tipo = 'ANCLIE'
AND jclasger.codclass = '01'

LEFT OUTER JOIN pub.jgercod
ON jclasger.codice = jgercod.codice
AND jgercod.tipo = 'ANCLIE'
AND jgercod.codclass = '01'

LEFT OUTER JOIN pub.jresp3
ON gpdc.codpdc = jresp3.codice
AND jresp3.tipo = 'RC'

LEFT OUTER JOIN pub.ysetcli
ON gpdc.codsett = ysetcli.codsett

LEFT OUTER JOIN pub.agenti
ON gpdc.agente = agenti.agente

where gpdc.tipmas = '3'

====================================

The SSIS package is composed of four steps:

Step 1 Truncate the SQL Server DW table (sketched just below this list).
Step 2 ADO.NET source editor that runs the SQL above (connection through the DSN).
Step 3 Data conversion for Unicode conversion.
Step 4 OLE DB destination that maps all the fields read and moves them into the SQL Server DW table.
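
Step 1 is essentially just this statement, presumably run from an Execute SQL task (a sketch; dbo.TBLClienti is the DW destination table):

====================================
-- Step 1 (T-SQL): empty the DW destination table before the full reload
TRUNCATE TABLE dbo.TBLClienti;
====================================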
 

maretix

Member
I agree with Progress; there is no need for NOLOCK with read uncommitted.

You just don't lock records with read uncommitted set, unless you have code changing the isolation level?

I don't specifically understand what your task is here, but pulling 5,000,000 records out of a database during peak usage sounds like overkill - and if you are running 32-bit Progress on the server this will be trashing your cache and probably causing the whole system to run slowly.

I worked at a place that did something like this, but it only ever copied changed records over, and data like telephone numbers and addresses was copied over during evening hours.

An alternative you may want to look at is a product called Pro2 - it is a replication tool which would give you a near real-time copy of your Progress data in a SQL database. There is a webinar on this tomorrow: https://community.progress.com/c/e/293.aspx

Cheers

Thanks, Chris, for your suggestions.
I will look into this further, including OpenEdge Pro2.
Thanks again.
 

TomBascom

Curmudgeon
I still do not see any actual error message.

It is very difficult to help debug errors that are so vague that they cannot even be posted. Surely there is an error message that occurs from time to time?

However -- since SSIS is involved... have you tried running these queries as "pass through" queries?

It is good that you run dbtool regularly. Have you ever run UPDATE STATISTICS?
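
If not, something along these lines, run from sqlexp or any SQL connection, would be a start for your main table (a sketch; adjust the table name as needed):

====================================
-- refresh the table, index and column statistics used by the SQL query optimizer
UPDATE TABLE STATISTICS AND INDEX STATISTICS AND ALL COLUMN STATISTICS FOR pub.gpdc;
COMMIT;
====================================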
 


maretix

Member
Sorry, I forgot to post the error message I encountered.

=======================

Package error: Imp_Clienti

Code: -1071636471

Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "The statement has been terminated.".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Violation of PRIMARY KEY constraint 'PK_TBLClienti'. Cannot insert duplicate key in object 'dbo.TBLClienti'. The duplicate key value is (1, 1315591065).".
==============================

I receive the PRIMARY KEY violation when the package executes the write, but there is no real duplicate:
when the task runs without errors, the records are unique.
The destination table has a composite key that is unique.

I would also add that when I check the record in question, it really was modified (or was being modified) by a user that same day.
That is why it is such a strange message.
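
To double-check that on the source side, I could run something like this over the same ODBC connection (assuming the second column of the SQL Server primary key maps to gpdc.codpdc; the LEFT OUTER JOINs in the full query could also multiply rows, so that would be worth ruling out separately):

====================================
-- count customer codes that appear more than once in the extracted set
SELECT gpdc.codpdc, COUNT(*) AS cnt
FROM pub.gpdc
WHERE gpdc.tipmas = '3'
GROUP BY gpdc.codpdc
HAVING COUNT(*) > 1;
====================================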
 

TomBascom

Curmudgeon
If you are writing data then you need locks.

You said this was a read-only query. Why are you writing data? What part of the query above writes data? (I don't see anything above that writes data -- but maybe I'm blind this morning.)

Also -- that error is coming from SQL Server. Not Progress. So I would have to think that you should be looking to SQL Server for the source of the problem.
 

maretix

Member
Thanks, Mr. Bascom.
I read records from OpenEdge and write them into a SQL Server table.
I added more logging to the SSIS package and you are right: it is a SQL Server problem.
Sorry for the trouble.
Thanks; the extra logs helped me understand where the error originates.
I do not know why yet, but I hope to find out.
Thanks again.
 

TomBascom

Curmudgeon
I'm happy to have helped!

And I am very pleased to see that posting the actual error message was the key.

I often find that simply talking to someone about a problem forces me to clearly define the problem and the act of doing so quickly leads to a solution. Or at least saves me from spending lots of time pursuing the wrong problem.
 

TheMadDBA

Active Member
Error messages help... who would have thought :)

I would suggest investigating the columns you have defined for the PK on the SQL side and see how they compare to unique indexes on the Progress DB. When you use dirty (uncommitted) reads you can get some pretty strange results depending on which index was used and how the application populates/changes certain columns (like reading the same records twice).
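
On the SQL Server side something like this will list the PK columns in key order, to compare with the unique indexes you can see in the Progress Data Dictionary (a sketch; table name taken from your error message):

====================================
-- list the primary key columns of the destination table, in key order (T-SQL)
SELECT i.name AS pk_name, c.name AS column_name, ic.key_ordinal
FROM sys.indexes i
JOIN sys.index_columns ic
  ON ic.object_id = i.object_id AND ic.index_id = i.index_id
JOIN sys.columns c
  ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE i.object_id = OBJECT_ID('dbo.TBLClienti')
  AND i.is_primary_key = 1
ORDER BY ic.key_ordinal;
====================================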

You should really look into Pro2SQL. It can simplify your job greatly since it will only send over new/changed/deleted records and you won't have to read the entire DB every night. It will also support near real time feeds and some basic transformations.
 

maretix

Member
I'm happy to have helped!

And I am very pleased to see that posting the actual error message was the key.

I often find that simply talking to someone about a problem forces me to clearly define the problem and the act of doing so quickly leads to a solution. Or at least saves me from spending lots of time pursuing the wrong problem.
You are right, Tom, thanks a lot.
You and the other good Progress specialists are my only bible for solving any strange problem.
Every time I bother you with a post, you always reply. That's good.

Regards.
 

maretix

Member
Error messages help... who would have thought :)

I would suggest investigating the columns you have defined for the PK on the SQL side and see how they compare to unique indexes on the Progress DB. When you use dirty (uncommitted) reads you can get some pretty strange results depending on which index was used and how the application populates/changes certain columns (like reading the same records twice).

You should really look into Pro2SQL. It can simplify your job greatly since it will only send over new/changed/deleted records and you won't have to read the entire DB every night. It will also support near real time feeds and some basic transformations.

Thanks, TheMadDBA.
I will investigate the error.
How can I know which index is really being used?
I am reading via ODBC with the SQL query posted above; can I specify an index using SQL? (I have never done that.)
I think my problem during working days is close to what you are explaining to me.

I am also interested in understanding whether the Pro2 product can fit my goal.
Regards.
 

TheMadDBA

Active Member
You would have to get an explain plan for your query... an example is here: http://knowledgebase.progress.com/articles/Article/20007
Keep in mind that, just like SQL Server, those plans can change as statistics change.

I think you will still have much more luck using something like Pro2, since it will take care of a lot of these issues. SQL access for Progress is getting better, but it is still essentially a bolt-on to a database that was never designed for SQL, and dirty reads cause issues even with native ABL/4GL (without proper programming).
 

maretix

Member
You would have to get an explain plan for your query... an example is here: http://knowledgebase.progress.com/articles/Article/20007
Keep in mind that, just like SQL Server, those plans can change as statistics change.

I think you will still have much more luck using something like Pro2, since it will take care of a lot of these issues. SQL access for Progress is getting better, but it is still essentially a bolt-on to a database that was never designed for SQL, and dirty reads cause issues even with native ABL/4GL (without proper programming).

Thanks a lot for your suggestions.
I will find out how much this Pro2 product costs; I am not the IT manager, so I cannot just buy anything, even if it would be very useful.
Regards.
 

maretix

Member
I tried to find out which index is best to use.
I added a WITH INDEX clause to the SQL query to specify that index.
I ran the query 5 times during user activity and it was never blocked with the primary key violation error.
We will see... thanks again to all of you.

................
.................
From pub.gpdc
WITH (INDEX (tipmas))
.................
................
................
 