
 OpenLink ODBC and OPENQUERY


JonathanH
Starting Member

8 Posts

Posted - 2010-02-17 : 23:23:15
I am using the OpenLink ODBC driver (via the Microsoft OLE DB Provider for ODBC, as per OpenLink's website) to connect a linked server to my SQL Server 2008 Express instance.

The problem I have is that six tables need to be linked, and when I open the view that refers to the tables, the ODBC layer returns an error, and in order to get access to the linked server again I have to reboot the computer.

The linked server is a Progress 9.1D server (AECLINK), and the views are defined as follows:

View1
SELECT DISTINCT a,b,c FROM
OPENQUERY(AECLINK,'SELECT * FROM tableA') as tbl1

View2
SELECT d,f,e FROM
OPENQUERY(AECLINK,'SELECT * FROM tableB') as tbl2 LEFT OUTER JOIN View1 ON c=f

View3
SELECT q,w,s,z,x,g FROM
OPENQUERY(AECLINK,'SELECT * FROM tableC') as tbl3 LEFT OUTER JOIN View2 ON f=g

View4
SELECT l,a,u,n,d,r,y,l,i,s,t FROM
OPENQUERY(AECLINK,'SELECT l,u,n,r,y,l,i,t FROM tableD LEFT OUTER JOIN tableE ON ... INNER JOIN tableF ON ...') as tbl4 INNER JOIN View3 ON ....

--------------------
Opening View4 crashes the system.

If you need more info, let me know.

JH

Kristen
Test

22859 Posts

Posted - 2010-02-18 : 03:24:27
We only ever pull data from OPENQUERY to #TEMP tables, and then use a separate query to JOIN them to other, local, tables.

Dunno if that is a solution for you though.
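Kristen's pattern, sketched against the hypothetical table/column names from the views above (a sketch only, not tested against a real linked server):

```sql
-- Pull each remote table across the ODBC layer into local temp tables first
SELECT * INTO #tblA FROM OPENQUERY(AECLINK, 'SELECT * FROM tableA');
SELECT * INTO #tblB FROM OPENQUERY(AECLINK, 'SELECT * FROM tableB');

-- Then do the JOIN entirely locally, keeping it away from the ODBC driver
SELECT d, f, e
FROM #tblB
LEFT OUTER JOIN #tblA ON c = f;
```

The idea is that each OPENQUERY call is a single, simple pass-through SELECT, so the remote driver never has to cooperate in a multi-table join.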

JonathanH
Starting Member

8 Posts

Posted - 2010-02-18 : 15:46:24
Do you think you could give me an example of this? Would this work in a view scenario if the temp table kept the same name?

Or would it be better to dump the data into SQL tables, and have those tables rebuilt on a daily basis?

Not exactly "Live" data then, but we can't always have everything we want, right?

JonathanH
Starting Member

8 Posts

Posted - 2010-02-18 : 19:26:35
OK, here's the solution I have used. The client won't be getting live data, but we have agreed to a 24-hour bulk update.

I have two saved SQL scripts:

Rebuild-tblDeceased.sql
and
Rebuild-tblGraves.sql

One builds the tblDeceased table directly from the external source, then alters the table to add a primary key constraint on the identity column.
The other does the same for the tblGraves table.

The SQL statements work fine, no questions there (though I can post them if necessary).

My question is:
Can I run these as part of a .bat file as a Scheduled Task?

I am looking through the MSDN website, and if I find the answer before one is posted here, I will return and post myself.

Solution one way or another :)
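For what it's worth, the answer is yes: a .bat file can be registered as a Scheduled Task either through the Control Panel wizard or with the built-in schtasks command. The task name, path, and time below are placeholders:

```
schtasks /Create /TN "NightlyTableRebuild" /TR "C:\DIRECTORY\rebuild.bat" /SC DAILY /ST 02:00
```

The task runs under the account you supply when creating it, so that account needs rights both to the SQL Server instance and to the output directories.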

JonathanH
Starting Member

8 Posts

Posted - 2010-02-18 : 19:38:45
To use sqlcmd to run the scripts, put the following in a .bat file.

----------------------------------------------------------------------
@echo off
SET SQL=C:\BINN-PATH-FOR-SQL-TOOLS
SET PATH=%PATH%;%SQL%

rem Now regenerate from source database and log the results
sqlcmd -S myServer\instanceName -i C:\DIRECTORY\Rebuild-tblDeceased.sql -o C:\DecAdds.txt
sqlcmd -S myServer\instanceName -i C:\DIRECTORY\Rebuild-tblGraves.sql -o C:\PlotAdds.txt

JonathanH
Starting Member

8 Posts

Posted - 2010-02-18 : 19:45:22
Using osql to run the scripts put the below in a .bat file.

----------------------------------------------------------------------
@echo off
SET SQL=C:\BINN-PATH-FOR-SQL-TOOLS
SET PATH=%PATH%;%PHP%

rem Now regenerate from source database and log the results
osql -E -S myServer\instanceName -i C:\DIRECTORY\Rebuild-tblDeceased.sql -o C:\DecAdds.txt
osql -E -S myServer\instanceName -i C:\DIRECTORY\Rebuild-tblDeceased.sql -o C:\DecAdds.txt

JonathanH
Starting Member

8 Posts

Posted - 2010-02-18 : 19:47:36
Note: sqlcmd uses the -E (trusted connection) option by default; the osql command does not.

JH

Kristen
Test

22859 Posts

Posted - 2010-02-19 : 01:39:15
"Do you think you could give me an example of this?"

SELECT *
INTO #TEMP
FROM OPENQUERY(...

"would this work in a View scenario if the temp-table maintained the same name?"

No, sorry, I had missed that requirement.

Might be worth looking at OPENROWSET (instead of OPENQUERY), although I expect it comes to the same thing in the end.
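For reference, OPENROWSET through the same MSDASQL (OLE DB Provider for ODBC) provider would look roughly like this; the DSN name and credentials are placeholders, and the 'Ad Hoc Distributed Queries' server option has to be enabled first:

```sql
-- One-off server setting required before ad hoc OPENROWSET calls will run
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

SELECT *
FROM OPENROWSET('MSDASQL',
                'DSN=MyOpenLinkDSN;UID=myUser;PWD=myPassword',
                'SELECT * FROM tableA');
```

As Kristen says, the query still goes through the same ODBC layer, so it may well behave the same as OPENQUERY in the end.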

" Or would it be better to dump the data into SQL tables, and have those tables rebuilt on a daily basis?

Not exactly "Live" data then, but we can't always have everything we want
"

We do that with most of the cross-system data we use, in particular where the remote end has a last-updated date/time so we can easily query new/changed records and just pull those across every hour or so.

Although the data isn't "live", there is usually another debit note, or whatever, waiting on someone's desk to be entered, and although people want "live data", usually "an hour ago" or "last night" is good enough IME.

Are you transferring the data as drop/create-table and INSERT statements for all the rows?

It would probably be better to just transfer the data (using BCP or SSIS rather than SQLCMD/OSQL).

What is the remote system that you are connecting to? (using OpenLink's ODBC drivers)

JonathanH
Starting Member

8 Posts

Posted - 2010-02-19 : 07:44:25
As stated in the original post (3rd paragraph), the linked server is a Progress 9.1D server (www.progress.com).

The way we have decided to go is this:

Fundamentally static data has been transferred to the SQL Server database: cemetery-specific data, sections of the cemetery, and rows within the sections. This data may be updated, but not regularly, and a manual update of the SQL Server database can be run when it is.

The daily updates involve records that do change frequently, like the grave plot information (the status of a grave can change from day to day) and the deceased/holders information (a new holder, a holder becomes deceased, etc.).

What is done is:

1. Check if the table exists; if so, drop it.
2. Run a SELECT <<required fields>> INTO tblBlah <- from OPENQUERY
3. Alter the table with the identity and PK constraint requirements
4. EXIT
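A sketch of one of the rebuild scripts following those four steps; the column names are placeholders, and only tblGraves and AECLINK come from the posts above:

```sql
-- 1. Drop the table if it already exists
IF OBJECT_ID('dbo.tblGraves', 'U') IS NOT NULL
    DROP TABLE dbo.tblGraves;

-- 2. Pull the required fields across the linked server into a new table
SELECT GraveID, Section, Row, GraveStatus
INTO dbo.tblGraves
FROM OPENQUERY(AECLINK, 'SELECT GraveID, Section, Row, GraveStatus FROM tableGraves');

-- 3. Add an identity column with a primary key constraint
ALTER TABLE dbo.tblGraves
    ADD RowID INT IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_tblGraves PRIMARY KEY CLUSTERED;
```

SELECT ... INTO creates the table from the remote result set's shape, which is why the key has to be bolted on afterwards in step 3.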

The total running time of the batch job is 7 minutes, which is OK given the network, the ODBC layer, and the fact that it processes 400k rows (200k-ish per table).

Thank you for the explanations and ideas Kristen. Love your work :)

JH

Kristen
Test

22859 Posts

Posted - 2010-02-19 : 10:38:45
Between (1) starting and (2) (or perhaps (3)) finishing, you won't have a "complete" system to report against.

That may not be a problem, e.g. if the import runs in the middle of the night, but even then it rather precludes running it "in an emergency" during the day.

A possible alternative would be

Run a SELECT <<required fields>> INTO #TEMP <- from OPENQUERY

Then:

DELETE D
FROM tblBlah AS D
LEFT OUTER JOIN #TEMP AS S
ON S.MyPK = D.MyPK
WHERE S.MyPK IS NULL -- Record no longer exists

UPDATE D -- Update existing records but only if they have changed
SET Col1 = S.Col1,
Col2 = S.Col2,
...
FROM tblBlah AS D
JOIN #TEMP AS S
ON S.MyPK = D.MyPK
WHERE (D.Col1 <> S.Col1
OR (D.Col1 IS NULL AND S.Col1 IS NOT NULL)
OR (D.Col1 IS NOT NULL AND S.Col1 IS NULL))
OR (D.Col2 <> S.Col2
OR (D.Col2 IS NULL AND S.Col2 IS NOT NULL)
OR (D.Col2 IS NOT NULL AND S.Col2 IS NULL))
...

INSERT INTO tblBlah(Col1, Col2, ...)
SELECT S.Col1, S.Col2, ...
FROM #TEMP AS S
LEFT OUTER JOIN tblBlah AS D
ON D.MyPK = S.MyPK
WHERE D.MyPK IS NULL -- New record

This does minimal updating on the target machine.

If you want to go down this route let me know; there is a bit of detail in some of the areas (handling varchar comparisons, which are case-insensitive; identity columns; ordering the updates so as not to upset any foreign keys; etc.) and the code can be mechanically generated, which saves a lot of work.
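On SQL Server 2008, the three statements can also be collapsed into a single MERGE; a sketch using the same hypothetical names (the column-difference test here is simplified and would need the same NULL handling as the UPDATE above):

```sql
MERGE tblBlah AS D
USING #TEMP AS S
    ON D.MyPK = S.MyPK
WHEN MATCHED AND (D.Col1 <> S.Col1 OR D.Col2 <> S.Col2) THEN
    UPDATE SET Col1 = S.Col1, Col2 = S.Col2   -- changed rows only
WHEN NOT MATCHED BY TARGET THEN
    INSERT (MyPK, Col1, Col2) VALUES (S.MyPK, S.Col1, S.Col2)  -- new rows
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;                                   -- rows gone from the source
```

Whether MERGE or the three separate statements is clearer is largely a matter of taste; the three-statement form is easier to generate mechanically.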

JonathanH
Starting Member

8 Posts

Posted - 2010-02-19 : 17:18:43
The way you have outlined it is a much better way of doing things, and while I'm sure the client would be impressed with an hourly update of the data, I'm not sure it is worth it.

We shut down the mapping software in the batch prior to running updates anyway, as a precaution (clears any table/row locking that the mapping software enforces), so they will not have access to the mapping while the update takes place.

JH

Kristen
Test

22859 Posts

Posted - 2010-02-20 : 01:44:32
" We shut down the mapping software in the batch prior to running updates anyway"

That's fine then. No point adding complexity if it is not needed - more complexity is always more bother

   
