Friday, February 24, 2012

Integration Services (data flow error)

When executing my data flow package, which contains only one source and one destination

OLE DB source -> SQL Server destination

the following errors occur in my output:

Error: 0xC0202009 at Data Flow Task(infraction action), SQL Server Destination [3600]: An OLE DB error has occurred. Error code: 0x80040E14.

Error: 0xC0202071 at Data Flow Task(infraction action), SQL Server Destination [3600]: Unable to prepare the SSIS bulk insert for data insertion.

Error: 0xC004701A at Data Flow Task(infraction action), DTS.Pipeline: component "SQL Server Destination" (3600) failed the pre-execute phase and returned error code 0xC0202071.


I've checked the structure of my source and destination tables, but nothing seems to be wrong.

If someone has ever faced these errors, please help me :D

Are you running the package on the same machine as the SQL Server instance that you are inserting into?

If not, it won't work. Use OLE DB Destination instead.

-Jamie

|||I have the same problem, and everything is on the same machine. I am transferring records from multiple tables in one database to a single table in another database. The destinations all have table lock OFF, and it usually works. However, I recently started getting this message. Also, if I rerun it, it happens on different tasks (the tasks run simultaneously, two at a time).|||

Hi,

I'm having this issue as well, i.e. a package using the "SQL Server Destination" data flow component, loading a server (a named instance) on the same machine that the SSIS package is executing on, fails with an "Unable to prepare the SSIS bulk insert for data insertion." error. I suspect the Windows account executing the package, or the SQL login it's mapped to, doesn't have some required permission, but I've tried making the SQL account dbo, bulkadmin etc., and added it to the local Administrators group, with no luck. I've seen this reported several times, but no solution?

Dave

|||

Hi guys,

I've mentioned some pre-requisites to getting SQL Server Destination working here:

Destination Adapter Comparison
(http://blogs.conchango.com/jamiethomson/archive/2006/08/14/SSIS_3A00_-Destination-Adapter-Comparison.aspx)

which may or may not be useful. I hope it is.

-Jamie

Integration Services won't install

I am installing Visual Studio 2005 Professional on Windows XP.

I have finished installing Visual Studio 2005 Professional along with Service Pack 1. Now I am installing the Developer edition of SQL 2005 that comes with Visual Studio 2005 Professional.

During the installation of SQL Server 2005, I got the following error during
the integration services installation portion:

Failed to install and configure assemblies: c:\program files\microsoft sql
server\90\dts\tasks\microsoft.sqlserver.msmqtask.dll in the COM+ catalog.
Error -2146233087

Error message: unknown error 0x80131501
Error description: You must have administrative credentials to perform this
task.

I'm logged into Windows as Administrator.

This machine did previously have VS 2005 Standard and SQL 2005 Standard loaded. The software was RTM software as provided by Microsoft at their launch event. I did uninstall both products before attempting the upgrade.

Any help would be much appreciated.

Hi,

Refer to the links below for a workaround:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=136467&SiteID=1

http://geekswithblogs.net/waterbaby/archive/2006/08/03/87048.aspx

HTH

Hemantgiri S. Goswami

|||

This is an issue when Microsoft.SqlServer.MSMQTask is left behind in the COM+ catalog. You can work around it by:

1. Open Administrative Tools\Component Services

2. Go to Component Services\Computers\My Computer\COM+ Applications

3. Delete Microsoft.SqlServer.MSMQTask

Integration Services Samples

Hi All,

I have installed SQL 2005, but there are no Integration Services samples.

The SQL 2005 manuals says:

"By default, the Integration Services samples are installed onto the local hard disk drive in the following folder: C:\Program Files\Microsoft SQL Server\90\Samples"

But there is no "Samples" dir under C:\Program Files\Microsoft SQL Server\90\.

Where can I get the samples?

Sergiy

You may not have selected the samples during the install. Quite sensibly, they are not there by default.

So you could re-run setup and select the samples under Custom/Advanced, or download the most up-to-date samples. With SQL 2005, the Docs and Samples team have been releasing updates, which is rather nice. The latest can be found as part of the overall SP2 release (http://www.microsoft.com/sql/sp2.mspx), and the direct link to the samples is http://www.microsoft.com/downloads/details.aspx?familyid=e719ecf7-9f46-4312-af89-6ad8702e4e6e&displaylang=en

Integration Services on a Multi-Instance SQL Server Cluster (Best Practice)

Hi

I have a 2-node active-passive SQL cluster (with SSIS manually clustered in the SQL resource group). I want to convert to active-active by installing a named instance on the second node. Am I correct in saying that the new virtual server cannot have its own SSIS instance, but must instead utilise the SSIS instance (and the MSDB database) of the first (original) virtual server (because there can only be one instance of SSIS on a node)?

Having said this, should I move the SSIS resources into (preferably) a dedicated SSIS resource group or (alternatively) the cluster/quorum group, moving the XML config file and the Packages folder to the associated physical disk for that group?

tia
brynjon
SSIS is typically set up as a local instance for each cluster node. Note that each node must be installed independently. You can cluster the resulting SSIS installations and refer to them as a cluster resource. Here is Kirk's blog entry on how to do that:
http://sqljunkies.com/WebLog/knight_reign/archive/2005/07/06/16015.aspx
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP

Integration Services notification services

Does anybody know of a notification service using SSIS? Is it SendMail or otherwise? Is there a step-by-step practice on how to create one?

|||Not sure I understand your question. Are you looking to trigger an SSIS package from Notification Services, or do you want to send a notification from the SSIS package?

|||I'm hoping it's the latter - send the notification from the SSIS package.

|||You can send an email by using the Send Mail Task in SSIS. Is that not what you are looking for?|||

That would possibly be the same as DTS! Do you know of any step-by-step article on how to do this in SSIS?

Just out of curiosity, if I were to use SQL Server's Notification Services, I'd be making use of SSRS (Reporting Services) as well, wouldn't I?

|||

If you search the forum for "Send Mail", you should find some links. Also, it's really not a very complicated task to use.

I'm not sure why you are tying Notification Services to Reporting Services (or Integration Services, for that matter). They are separate products, and while they can interact with each other, they do it the same way any external application would interact. For example, SSIS can interact with Reporting Services directly by calling the web service to generate reports (just like any other external application).

|||The reason I mentioned Reporting Services is because I read in an MSDN article that Analysis Services uses Reporting Services for Notification Services.|||

onamika wrote:

The reason I mentioned Reporting Services is because I read in an MSDN article that Analysis Services uses Reporting Services for Notification Services.

Neither of which are related to SSIS though.

From SSIS, look at using the Send Mail task, or using a web service to handle notifications.|||Frankly, I don't see any Send Mail task examples, or how to create one. Maybe it's because I'm fairly new to MSDN.|||

http://forums.microsoft.com/MSDN/Search/Search.aspx?words=Send+Mail&localechoice=9&SiteID=1&searchscope=forumscope&ForumID=80

The second result returned has a walkthrough of using a Send Mail Task inside a For..Each loop. I didn't go through the rest, but I'm sure there are more examples.

|||

onamika wrote:

Frankly, I don't see any Send Mail task examples, or how to create one. Maybe it's because I'm fairly new to MSDN.

BOL has some information on the Send Mail task inside SSIS.

http://msdn2.microsoft.com/en-us/library/ms142165.aspx|||

onamika wrote:

The reason I mentioned Reporting Services is because I read in an MSDN article that Analysis Services uses Reporting Services for Notification Services.

Perhaps the nomenclature is a problem here. Perhaps it should be "notification services" rather than (the product) "Notification Services".

I'd like to read that MSDN article because if that is what it says then it is lying. Analysis Services does not automatically use Reporting Services for anything. You can deliver reports on top of Analysis Services using Reporting Services - but that is something else entirely.

-Jamie

|||

onamika wrote:

Frankly, I don't see any Send Mail task examples, or how to create one.

It's very, very simple. Just try it - with a little perseverance you won't need any documentation.

onamika wrote:

Maybe it's because I'm fairly new to MSDN.

Not sure what you mean by this. What has MSDN got to do with this? How can you be "new to MSDN"?

-Jamie

|||

I'd like to stick to the point - question about "SendMail" please!

Ok, bonus question for you. This is mostly text based mail I see. How can I make this email HTML based?

|||

onamika wrote:

I'd like to stick to the point - question about "SendMail" please!

Ok, bonus question for you. This is mostly text based mail I see. How can I make this email HTML based?

Drop the Send Mail task from the Control Flow toolbox onto the work surface. Double-click on it to configure it. From there, I think you should be able to figure out what to put where.

As far as HTML e-mail (WHY? WHY? WHY?), please search this forum for examples. This has been discussed before.

As an aside - HTML e-mail is VERY BAD design, in my opinion. HTML should be left to web servers to display. If I had it my way (and many in the security community agree), e-mails would be limited to TEXT ONLY. Anyway......... That's my rant for today.

Integration Services missing from Management Studio

I have installed SQL Server 2005. I am able to create and edit Integration Services packages from VS, but Management Studio does not show a folder for Integration Services so I can manage the packages from within Management Studio itself. From what I can tell from the documentation, I should be able to see and manage them from within Management Studio. What am I missing?
To work with SSIS you need to use Business Intelligence Development Studio, that is, VS. There is not much you can do with SSIS from Management Studio.
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx

Integration Services Login

How do I add myself into Integration Services, so I can use SQL Server Management Studio to log in to Integration Services?

First I tried logging into SQL Server 2005 itself using 'sa', adding my NT login as a login, i.e. 'SomeGroup\SomeloginName', opening the properties for the NT login name, and assigning all roles including sysadmin to the name.

Then, going back to SQL Server Management Studio to connect to Integration Services, it still says access denied.

Thanks
--
MVP888

|||Hi,
From your description, I understand you are not able to connect to SSIS with the account even though you have assigned the SA privilege. If I have misunderstood your concern, please feel free to point it out.

First of all, please use SQL Server Configuration Manager to make sure you have started SQL Server Integration Services.

Secondly, have you tried using both Windows Authentication and SQL Authentication to log in to SSIS?

Thirdly, is it possible for you to show us a screenshot?
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.

|||Thanks Michael
1. SQL Server Integration Services is started, because I can log in using the NT account that was used to install SQL Server, but not other NT accounts.
2. I cannot switch over to use the SQL Authentication option when I log into Integration Services, because it is grayed out. However, I do have the SQL Authentication option when I log into the SQL Server database itself. (Why is that?)
3. I have a problem pasting the screen, but here is what is on the error screen when I try to log into SSIS:

Cannot connect to ldqcedcmcache1. --> my machine name

Additional Information:
--> Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum)
--> Connect to SSIS Service on machine “ldqcedcmcache1” failed:
Access is denied.

|||Hi,
Only Microsoft Windows Authentication is available for SSIS. Windows
Authentication mode allows a user to connect through a Windows user account.
For more detailed information, check the BOL topic below
Connect to Server (Integration Services)
http://msdn2.microsoft.com/en-us/library/ms183366(en-US,SQL.90).aspx
Since SSIS only allows Windows Authentication, please make sure your Windows account has sufficient privileges for SSIS.
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.

|||Hi,
I've just fixed this problem myself. This probably isn't the proper fix, but I found that adding the user to the local machine's (i.e. the SQL Server's local) Administrators group gave me the rights to log on.

This is only a quick fix, but the issue must be that whatever user you want to access Integration Services must need certain levels of access to the OS, most likely the ability to start and stop services.

It's a starting point anyway!

Regards
Brett
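Brett's quick fix above can be applied from a command prompt on the SQL Server machine; this is only a sketch, and MYDOMAIN\someuser is a placeholder for the account that needs access:

```shell
rem Quick-fix sketch from the post above: add the connecting Windows
rem account to the server's local Administrators group.
rem MYDOMAIN\someuser is a placeholder account name.
net localgroup Administrators MYDOMAIN\someuser /add
```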
|||I was pleased to find this subject posted on the TechNet board, but somewhat disappointed in the resolution. I too am getting the exact error as posted by the originator. I just CANNOT get connected to an installation of Integration Services other than the one running on my machine (Developer Edition).

Where do I specify that a domain group has access to Integration Services? I can go to the Database Engine and give the group rights on the server, but that only applies to the Database Engine, not Integration Services.

Do I need to do as Brett Hargreaves pointed out and add the domain group to the local server's Administrators group? I like to think there is a better solution than that.

**********************************
> Since the SSIS will only allow Windows Authentication, please make sure
> your Windows Account has sufficient privilege to SSIS.
***********************************

Where do I go to make that happen?

Thank you in advance.

ToddC
|||I've installed SQL 2005 on my desktop, and have no problem connecting to the Database Engine for databases, jobs etc., but when I try to connect to Integration Services from my desktop, I get the error:

Connect to SSIS Service on machine "SQL_TestServer" failed:
Access is denied.

I'm a member of sysadmin within SQL, as well as an administrator on the server. I don't have any problem when I log on to the server directly via Remote Desktop and open IS. Am I missing something obvious?

Integration Services log report pack - provided by Microsoft - US date format issue

Hello all,

Microsoft provides a report pack of RS reports to use with Integration Services standard logging. The link is here:
http://www.microsoft.com/downloads/details.aspx?FamilyID=526e1fce-7ad5-4a54-b62c-13ffcd114a73&DisplayLang=en#QuickInfoContainer

I have been trying to use these reports in the UK and have come across a problem with the date formatting. The reports seem to expect the date formatting in the report parameters to be in US format. I have changed all the report language settings to UK - English, and the date formatting in the datasets that supply the parameter values all use ANSI dates (format 102), but I still get date conversion errors when trying to return the datasets unless I input the dates in US format.

Any advice on how to get around this issue would be greatly appreciated.
Thanks
As a follow-up, I have found that the only way to run the report is via the Internet Explorer browser AFTER setting the locale of the browser to US-English, so that it passes the date parameters in the correct format. Obviously, this is not a practical solution, so if anyone has found a proper solution, please let me know.

Thanks

Integration Services Install only

Can Integration Services alone be installed on a server, without installing any other components of SQL Server 2005? If so, do I need a separate license for it?

I have SQL Server installed on a server, and I am trying to execute an IS package on the SQL Server from a Citrix box, but it returns an error. In several places I read, they resolve it by installing IS on the client (the Citrix server in my case).

T.I.A

|||Hi

Look at http://msdn2.microsoft.com/en-us/library/ms144259.aspx or Books Online, and install SQL 2005 from the command line using the ADDLOCAL option with the SQL_DTS child feature.

John
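A hedged sketch of what John describes; only the ADDLOCAL option and the SQL_DTS feature name come from his post, so check the remaining switches against the Books Online setup topic he links:

```shell
rem Sketch: command-line SQL Server 2005 setup installing only the
rem Integration Services feature (SQL_DTS). Run from the setup media folder.
rem /qb shows a basic progress UI instead of the full wizard.
start /wait setup.exe /qb ADDLOCAL=SQL_DTS
```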
|||Thanks John. How about the licensing issue?
|||Hi
You would need to consult your software vendor, but if you install the SQL Server Engine on this node, I expect it will be the normal cost of an active node.

John
|||Thanks............


Integration Services in SSMS

We use Windows Authentication to connect to SQL Server; are there any special permissions required to connect to Integration Services in SSMS?

Whenever I try to browse the servers available with Integration Services (from Object Browser), none of the servers gets listed. If I directly give the server name and try to connect to Integration Services, I get the following error. But I'm able to connect to the Database Engine.

TITLE: Connect to Server

Cannot connect to <servername>.


ADDITIONAL INFORMATION:

Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum)

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476

Connect to SSIS Service on machine "<servername>" failed:
Access is denied.

Connect to SSIS Service on machine "<servername>" failed:
Access is denied.

Thanks

Have you searched this forum or the Internet?

A well documented issue exists. Make sure you follow:

http://www.ssistalk.com/2007/04/13/ssis-access-is-denied-when-connection-to-remote-ssis-service/

|||I think I went through those threads; forgive me for my ignorance, but do these DCOM configuration steps need to be done on the server where SSIS is running? Since the final step says to restart Integration Services, I assume this needs to be done at the server.

Thanks
|||

Karunakaran wrote:

I think I went through those threads; forgive me for my ignorance, but do these DCOM configuration steps need to be done on the server where SSIS is running? Since the final step says to restart Integration Services, I assume this needs to be done at the server.

Thanks

Correct. This is all on the server.
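Once the DCOM changes are made on the server, the final restart step can be done from a command prompt there; MsDtsServer is the SQL Server 2005 Integration Services service name (a sketch, to be run as an administrator):

```shell
rem Restart the SSIS service so the DCOM permission changes take effect.
net stop MsDtsServer
net start MsDtsServer
```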

Integration Services in BOL: Is this all I get?

Short answer
NO, the beta or CTP Books Online that you're using is not all there's going to be. Make sure that you watch for the Web release of BOL that will drop around the time that SQL Server 2005 hits the streets! Read more to understand why.

More information
SQL Server 2005 is going to be released simultaneously in 8 or 9 languages. Since the documentation team had to lock down early to allow time for translation, the version of Books Online that you'll get on the CD in the box is going to be several months old. Think of the BOL in the box as Version 0.9, while Version 1.0 will be the RTM Web release. Meanwhile we're working non-stop to update, improve, and add to the Integration Services content in Books Online.

Remember that the BOL feedback links are your most direct route to improved content!

-Doug

Doug,
Are we going to be able to sync our local install with the web release?

-Jamie

|||Jamie,

If you mean, will you see updated content silently downloaded on a topic-by-topic basis in your existing BOL installation, unfortunately, no. BOL and the Microsoft Help 2.0 technology that we rely on don't currently support that. The Web release around RTM that I mentioned will simply be a complete new BOL installation...and yes, a rather sizeable download.

We will, at least, be flagging content that has changed significantly so that you won't have to scan all the sections that interest you with the hope of coming across some new tidbit of information.

Best regards,

-Doug

Integration Services global variable and scheduling issue

I have one main package from which I am executing 5 other packages. I want to use one single DB connection in all the packages, which I am declaring in the main package, and it should be available in all the child packages. How can I do this?

A few variables are common to all the packages, so I want to keep them global; how can I access those variables?

How can I schedule a package in SQL Server?

Thanks

Priyank Gajera wrote:

I have one main package from which i am executing 5 other packages.I want to use one single DB connection in all the packages which i am declaring in Main package and it should be available in all the child packages. How can i Do this?

You can use package configuration in your child packages, based on a parent package variable, to set the connection strings.

Priyank Gajera wrote:

Few Variables are common is all the packages so i want to keep it Globally so how can i access those variables ?

You mean, how can the child packages 'see' the value of a variable in the parent package? For that, use package configuration as well, as described in the previous answer.

Priyank Gajera wrote:

How can i schedule package in sql server?

Use SQL Agent; create a job to run your package. There is a lot of information about that in this forum.

Rafael Salas
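As a hedged sketch of the SQL Agent route: an Operating system (CmdExec) job step could invoke the package with the dtexec utility; the package path below is purely illustrative:

```shell
rem Sketch: command a SQL Agent CmdExec job step could run to execute
rem the main package from the file system (path is a placeholder).
dtexec /FILE "C:\SSIS\Packages\MainPackage.dtsx"
```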

|||

Parent package configurations can get a little tricky. The easiest thing to do is set up an environment variable with your connection string, and then configure your packages to use that environment variable.

To configure your packages, go to the SSIS --> Package Configurations menu, and in the drop-down choose Environment Variable. Map that to the ConnectionString property of the connection manager you are using.

SQL Agent will schedule your package to run. You can access it through SQL Server Management Studio.
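The environment variable mentioned above can be created from a command prompt (a sketch; the variable name and connection string are placeholders, and since processes only see variables set before they start, BIDS and the SQL Agent service would need a restart afterwards):

```shell
rem Sketch: create the machine-level environment variable that an SSIS
rem "Environment variable" package configuration could read.
rem Variable name and connection string are placeholders.
setx SSIS_DB_CONNSTR "Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI;" -m
```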

|||

Instead of using variable, Can I use the same connection for child packages which I am using in Main package ?

And if there is no other way, then how can I assign a variable value (connection string) to an OLE DB connection manager?

|||

Hi Rafael

Can we share connections (OLE DB or FTP connections) between packages?

Can you give an example of using a parent package variable? If that is not possible, please describe the procedure.

Thanks

Priyank Gajera

|||Hi,

Thanks for the answer

I am able to get other variables or values in the child package using a parent package variable, but I don't know how to assign a variable to an OLE DB connection manager or FTP connection manager (I am using both types of connection manager, as well as a flat file connection).

So please guide me on how to assign a connection string to FTP, OLE DB, or flat file connection managers?|||

Priyank Gajera wrote:

Hi,

Thanks for the answer

I am able to get other variables or values in the child package using a parent package variable, but I don't know how to assign a variable to an OLE DB connection manager or FTP connection manager (I am using both types of connection manager, as well as a flat file connection).

So please guide me on how to assign a connection string to FTP, OLE DB, or flat file connection managers?

The procedure is pretty much described in Sean's post. Basically, you need to place the connection string somewhere that is accessible to package configurations (an environment variable, registry key, XML file, etc.), then set up the package configuration in your package to go there, 'grab' the connection string, and assign it to the ConnectionString property of your connection manager.
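The XML-file option mentioned above is a .dtsConfig file, which is just a property path plus a value. A minimal sketch, assuming a connection manager named MyOleDbConn (the connection manager, server, and database names are placeholders):

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[MyOleDbConn].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

Every child package that points its XML configuration at this file picks up the same connection string, which is the "share one connection between packages" effect being asked for.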

Rafael Salas


Integration Services For Each Loop Question

I'm wondering if this can be done (I've had no luck so far):
I would like to loop thru a set of files in a directory, place the file name
into a variable, then insert the file name as a record into a table.
I've gotten the for each (file) loop set up and working fine, placing the
file name into a user variable, but can't figure out how to use the value of
the variable to insert it into a table.
The next step would be to loop thru the records in the table retrieving each
file name into a variable to use in another for each file loop to copy the
file to another location.
The purpose of the entire exercise is to put file names into two tables for
log backups. I'm backing up log files to a local directory, and want to copy
them to a network share. Since I only want to copy newly backed up files, I
would like to go thru all the files on the local drive, write them to a
table, go thru all the previously copied files on the network share, write
the names to another table, join the two tables to find which files do not
exist on the network share and copy them to it.
There may be a better way to do this, but I've not figured it out so far.
Thanks for any assistance with this.
TomT

TomT wrote:
> I'm wondering if this can be done (I've had no luck so far):
> I would like to loop thru a set of files in a directory, place the file name
> into a variable, then insert the file name as a record into a table.
> I've gotten the for each (file) loop set up and working fine, placing the
> file name into a user variable, but can't figure out how to use the value of
> the variable to insert it into a table.
> The next step would be to loop thru the records in the table retrieving each
> file name into a variable to use in another for each file loop to copy the
> file to another location.
> The purpose of the entire exercise is to put file names into two tables for
> log backups. I'm backing up log files to a local directory, and want to copy
> them to a network share. Since I only want to copy newly backed up files, I
> would like to go thru all the files on the local drive, write them to a
> table, go thru all the previously copied files on the network share, write
> the names to another table, join the two tables to find which files do not
> exist on the network share and copy them to it.
> There may be a better way to do this, but I've not figured it out so far.
> Thanks for any assistance with this.
> TomT
You're going about this the hard way. If you can run xp_cmdshell, you
can do this with a single insert statement:
CREATE TABLE #Table (
[FileName] VARCHAR(255)
)
DECLARE @Command VARCHAR(255)
SELECT @Command = 'master..xp_cmdshell ''DIR /B C:\WINDOWS'''
INSERT INTO #Table
EXEC (@Command)
SELECT * FROM #Table
DROP TABLE #Table
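The same pattern, a directory listing landed straight into a table, can be sketched outside SQL Server. A stand-in using Python and sqlite3 in place of xp_cmdshell and the temp table (the directory contents and table name are fabricated for the demo):

```python
import os
import sqlite3
import tempfile

# Stand-in for the #Table temp table: one text column of file names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE file_names (file_name TEXT)")

# Stand-in for DIR /B: enumerate a directory's entries by bare name.
target_dir = tempfile.mkdtemp()
for name in ("a.trn", "b.trn"):
    open(os.path.join(target_dir, name), "w").close()

# One bulk insert of the whole listing, mirroring INSERT ... EXEC above.
conn.executemany(
    "INSERT INTO file_names (file_name) VALUES (?)",
    [(n,) for n in sorted(os.listdir(target_dir))],
)

rows = [r[0] for r in conn.execute("SELECT file_name FROM file_names ORDER BY file_name")]
print(rows)  # → ['a.trn', 'b.trn']
```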
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||Thanks Tracy, I guess I was hoping to incorporate this into an SSIS package
which would do the backups, history clean up, and file copy. Also, trying to
get up to speed on SSIS capabilities.
"Tracy McKibben" wrote:

> TomT wrote:
> You're going about this the hard way. If you can run xp_cmdshell, you
> can do this with a single insert statement:
> CREATE TABLE #Table (
> [FileName] VARCHAR(255)
> )
> DECLARE @Command VARCHAR(255)
> SELECT @Command = 'master..xp_cmdshell ''DIR /B C:\WINDOWS'''
> INSERT INTO #Table
> EXEC (@Command)
> SELECT * FROM #Table
> DROP TABLE #Table
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com
>|||TomT wrote:
> Thanks Tracy, I guess I was hoping to incorporate this into an SSIS package
> which would do the backups, history clean up, and file copy. Also, trying to
> get up to speed on SSIS capabilities.
>
Ahh, ok... Well, this may or may not interest you, I have a script that
will automatically backup any database on your server, including t-logs,
and will handle the history cleanup too...
http://realsqlguy.com/twiki/bin/view/RealSQLGuy/AutomaticBackupOfAllDatabases|||Thanks Tracy - that's very helpful, and definitely of interest to me.
At the same time, I'm still interested in finding out if there's a way to do
what I'm attempting via SSIS, particularly getting the file loop variable
into a sql insert statement.
I appreciate your feedback and assistance with this issue...
"Tracy McKibben" wrote:

> TomT wrote:
> Ahh, ok... Well, this may or may not interest you, I have a script that
> will automatically backup any database on your server, including t-logs,
> and will handle the history cleanup too...
> http://realsqlguy.com/twiki/bin/view/RealSQLGuy/AutomaticBackupOfAllDatabases
>|||Hi,
Thanks for your post!
To be clear about your issue, I would appreciate knowing:
1) You can retrieve each filename into a variable now, but you don't know
how to insert it into a table?
If so, you can write SQL like this:
INSERT INTO Table_Name([colname1],...,[colnamen])
VALUES(@FileName1,...,[@othervalue])
2) Is your purpose as follows?
Retrieve related files on the local drive and insert their names into
one table;
Retrieve related files on the remote network drive and insert their
names into a second table;
Compare the data between the two tables; if the 1st table's file names
are not in the second table, copy them to the network drive.
For such an issue, you can't directly use T-SQL in workflow control in SSIS.
I recommend you use the "Execute SQL Task" and map the variable to a
parameter in the task.
Right-click the task -> Edit -> Parameter Mapping.
Then you can use the parameter in the SQL statement in the
General tab of the "Execute SQL Task":
http://msdn2.microsoft.com/en-us/library/ms141003.aspx
Also, you could configure the "variable mapping" of the foreach loop container
to the variable you use.
Other tasks, like the ActiveX Script Task, can also help you deal with this
case.
http://msdn2.microsoft.com/en-us/library/ms137525.aspx
If you have any other concerns, please feel free to let me know. I'm
happy to assist.
+++++++++++++++++++++++++++
Charles Wang
Microsoft Online Partner Support
+++++++++++++++++++++++++++
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
Business-Critical Phone Support (BCPS) provides you with technical phone
support at no charge during critical LAN outages or "business down"
situations. This benefit is available 24 hours a day, 7 days a week to all
Microsoft technology partners in the United States and Canada.
This and other support options are available here:
BCPS:
https://partner.microsoft.com/US/technicalsupport/supportoverview/40010469
Others:
https://partner.microsoft.com/US/technicalsupport/supportoverview/
If you are outside the United States, please visit our International
Support page:
http://support.microsoft.com/default.aspx?scid=%2finternational.aspx.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.|||TomT - it sounds like everyone has an opinion! Here is a way to do
what you want with SSIS. Then see below that for my opinion :).
1 - Open your new SSIS project, and add a Foreach Loop container
2 - In the properties, make it a Foreach File Enumerator, and choose
the folder you want. Set the other properties, such as if you want the
fully qualified filename
3 - In the variable mappings area, Add a new user variable named
"FileName"; it should appear as User::FileName. Click OK & you're done
with the loop container
4 - Now go modify the variable properties (View, Other Windows,
Variables). Highlight the variable created in #3 and press F4 to get
the properties window.
5 - Change "EvaluateAsExpression" to True
6 - Change the "Expression" property to be your SQL Statement, but now
you're including your variable (notice the single quotes around your
variable):
"INSERT test SELECT '" + @.[User::FileName] + "'"
7 - Now add an "Execute SQL Task" to your loop container. Choose your
database. Change the SQL SourceType to be Variable.
8 - Choose your variable, which has now been filled with your SQL
statement.
9 - test out by debugging & then select the data from your table
SELECT * FROM test
Your next step - you mentioned it was to copy the files from the
original location to a new location using the table to loop. You can
skip all of the above madness, by just using the file system task, and
copy the entire contents of directory #1 to directory #2.
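One caveat on step 6: building the INSERT by string concatenation breaks if a file name contains a single quote, which is exactly what the Parameter Mapping route (Charles's suggestion) avoids. A sketch of the safe version, using sqlite3 as a stand-in (the table name test matches the example steps; the file name is invented to show the hazard):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (file_name TEXT)")

file_name = "o'brien_backup.trn"  # a name that breaks naive quoting

# String concatenation (the expression in step 6) would produce:
#   INSERT test SELECT 'o'brien_backup.trn'   -- malformed SQL
# Passing the value as a parameter keeps quoting out of the statement:
conn.execute("INSERT INTO test (file_name) VALUES (?)", (file_name,))

stored = conn.execute("SELECT file_name FROM test").fetchone()[0]
print(stored)  # → o'brien_backup.trn
```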
---
I'm wondering if this can be done (I've had no luck so far):
I would like to loop thru a set of files in a directory, place the file
name
into a variable, then insert the file name as a record into a table.
I've gotten the for each (file) loop set up and working fine, placing
the
file name into a user variable, but can't figure out how to use the
value of
the variable to insert it into a table.
The next step would be to loop thru the records in the table retrieving
each
file name into a variable to use in another for each file loop to copy
the
file to another location.
The purpose of the entire exercise is to put file names into two tables
for
log backups. I'm backing up log files to a local directory, and want to
copy
them to a network share. Since I only want to copy newly backed up
files, I
would like to go thru all the files on the local drive, write them to a
table, go thru all the previously copied files on the network share,
write
the names to another table, join the two tables to find which files do
not
exist on the network share and copy them to it.
There may be a better way to do this, but I've not figured it out so
far.
Thanks for any assistance with this.
TomT|||Thanks Corey - that's the direction I wanted to go with this (getting the
variable mapped properly in the SQL statement).
There are a couple of reasons I'm going about things in this way. Since I'm
doing log backups every hour locally, and then copying these files over to
another system, I want to only copy over the most recent file. The other
(network) system should also have copies of previously copied logs. I wanted
to keep the i/o down, and only copy over the latest backup.
Secondly, I wanted to do this as a way to get more familiar with SSIS. I
think all I need to do really is just get the name of the most recent backup
log file, but I don't know how to get that info via SSIS.
So, e.g., say I've got two days worth of log backups on the local system,
then copy these over (say manually just for now) to a network system. Then,
the next time a log backup takes place on the local system, I want to copy
JUST that one over to the network system. If I could just identify that
particular (most recent) log backup's file name, I'd be set.
Again, this is just as much to learn SSIS as anything else right now.
Thanks for your help
"CoreyB" wrote:

> TomT - it sounds like everyone has an opinion! Here is a way to do
> what you want with SSIS. Then see below that for my opinion :).
> 1 - Open your new SSIS project, and add a Foreach Loop container
> 2 - In the properties, make it a Foreach File Enumerator, and choose
> the folder you want. Set the other properties, such as if you want the
> fully qualified filename
> 3 - In the variable mappings area, Add a new user variable named
> "FileName"; it should appear as User::FileName. Click OK & you're done
> with the loop container
> 4 - Now go modify the variable properties (View, Other Windows,
> Variables). Highlight the variable created in #3 and press F4 to get
> the properties window.
> 5 - Change "EvaluateAsExpression" to True
> 6 - Change the "Expression" property to be your SQL Statement, but now
> you're including your variable (notice the single quotes around your
> variable):
> "INSERT test SELECT '" + @.[User::FileName] + "'"
> 7 - Now add an "Execute SQL Task" to your loop container. Choose your
> database. Change the SQL SourceType to be Variable.
> 8 - Choose your variable, which has now been filled with your SQL
> statement.
> 9 - test out by debugging & then select the data from your table
> SELECT * FROM test
>
> Your next step - you mentioned it was to copy the files from the
> original location to a new location using the table to loop. You can
> skip all of the above madness, by just using the file system task, and
> copy the entire contents of directory #1 to directory #2.
>
> ---
> I'm wondering if this can be done (I've had no luck so far):
> I would like to loop thru a set of files in a directory, place the file
> name
> into a variable, then insert the file name as a record into a table.
> I've gotten the for each (file) loop set up and working fine, placing
> the
> file name into a user variable, but can't figure out how to use the
> value of
> the variable to insert it into a table.
> The next step would be to loop thru the records in the table retrieving
> each
> file name into a variable to use in another for each file loop to copy
> the
> file to another location.
> The purpose of the entire exercise is to put file names into two tables
> for
> log backups. I'm backing up log files to a local directory, and want to
> copy
> them to a network share. Since I only want to copy newly backed up
> files, I
> would like to go thru all the files on the local drive, write them to a
> table, go thru all the previously copied files on the network share,
> write
> the names to another table, join the two tables to find which files do
> not
> exist on the network share and copy them to it.
> There may be a better way to do this, but I've not figured it out so
> far.
> Thanks for any assistance with this.
> TomT
>|||Sounds good - good luck!
TomT wrote:
> Thanks Corey - that's the direction I wanted to go with this (getting the
> variable mapped properly in the sql statement.
> There are a couple of reasons I'm going about things in this way. Since I'm
> doing log backups every hour locally, and then copying these files over to
> another system, I want to only copy over the most recent file. The other
> (network) system should also have copies of previously copied logs. I wanted
> to keep the i/o down, and only copy over the latest backup.
> Secondly, I wanted to do this as a way to get more familiar with SSIS. I
> think all I need to do really is just get the name of the most recent backup
> log file, but I don't know how to get that info via SSIS.
> So, e.g., say I've got two days worth of log backups on the local system,
> then copy these over (say manually just for now) to a network system. Then,
> the next time a log backup takes place on the local system, I want to copy
> JUST that one over to the network system. If I could just identify that
> particular (most recent) log backup's file name, I'd be set.
> Again, this is just as much to learn SSIS as anything else right now.
> Thanks for your help
> "CoreyB" wrote:
>
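The copy-only-what-is-missing logic discussed in this thread reduces to a set difference on file names. A minimal Python sketch (the directories here are throwaway temp folders standing in for the local drive and the network share):

```python
import os
import shutil
import tempfile

def copy_missing(local_dir, share_dir):
    """Copy files that exist in local_dir but not yet in share_dir.

    This is the two-table join from the original post, done as a
    set difference on file names.
    """
    missing = sorted(set(os.listdir(local_dir)) - set(os.listdir(share_dir)))
    for name in missing:
        shutil.copy2(os.path.join(local_dir, name), os.path.join(share_dir, name))
    return missing

# Demo: two local log backups, one already copied to the share.
local, share = tempfile.mkdtemp(), tempfile.mkdtemp()
for n in ("log1.trn", "log2.trn"):
    open(os.path.join(local, n), "w").close()
open(os.path.join(share, "log1.trn"), "w").close()

print(copy_missing(local, share))  # → ['log2.trn']
```

Run again after nothing new has been backed up and it copies nothing, which is the low-I/O behavior TomT was after.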

Integration Services For Each Loop Question

I'm wondering if this can be done (I've had no luck so far):
I would like to loop thru a set of files in a directory, place the file name
into a variable, then insert the file name as a record into a table.
I've gotten the for each (file) loop set up and working fine, placing the
file name into a user variable, but can't figure out how to use the value of
the variable to insert it into a table.
The next step would be to loop thru the records in the table retrieving each
file name into a variable to use in another for each file loop to copy the
file to another location.
The purpose of the entire exercise is to put file names into two tables for
log backups. I'm backing up log files to a local directory, and want to copy
them to a network share. Since I only want to copy newly backed up files, I
would like to go thru all the files on the local drive, write them to a
table, go thru all the previously copied files on the network share, write
the names to another table, join the two tables to find which files do not
exist on the network share and copy them to it.
There may be a better way to do this, but I've not figured it out so far.
Thanks for any assistance with this.
TomTTomT wrote:
> I'm wondering if this can be done (I've had no luck so far):
> I would like to loop thru a set of files in a directory, place the file name
> into a variable, then insert the file name as a record into a table.
> I've gotten the for each (file) loop set up and working fine, placing the
> file name into a user variable, but can't figure out how to use the value of
> the variable to insert it into a table.
> The next step would be to loop thru the records in the table retrieving each
> file name into a variable to use in another for each file loop to copy the
> file to another location.
> The purpose of the entire exercise is to put file names into two tables for
> log backups. I'm backing up log files to a local directory, and want to copy
> them to a network share. Since I only want to copy newly backed up files, I
> would like to go thru all the files on the local drive, write them to a
> table, go thru all the previously copied files on the network share, write
> the names to another table, join the two tables to find which files do not
> exist on the network share and copy them to it.
> There may be a better way to do this, but I've not figured it out so far.
> Thanks for any assistance with this.
> TomT
You're going about this the hard way. If you can run xp_cmdshell, you
can do this with a single insert statement:
CREATE TABLE #Table (
[FileName] VARCHAR(255)
)
DECLARE @.Command VARCHAR(255)
SELECT @.Command = 'master..xp_cmdshell ''DIR /B C:\WINDOWS'''
INSERT INTO #Table
EXEC (@.Command)
SELECT * FROM #Table
DROP TABLE #Table
--
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||Thanks Tracy, I guess I was hoping to incoporate this into an SSIS package
which would do the backups, history clean up, and file copy. Also, trying to
get up to speed on SSIS capabilities.
"Tracy McKibben" wrote:
> TomT wrote:
> > I'm wondering if this can be done (I've had no luck so far):
> >
> > I would like to loop thru a set of files in a directory, place the file name
> > into a variable, then insert the file name as a record into a table.
> >
> > I've gotten the for each (file) loop set up and working fine, placing the
> > file name into a user variable, but can't figure out how to use the value of
> > the variable to insert it into a table.
> >
> > The next step would be to loop thru the records in the table retrieving each
> > file name into a variable to use in another for each file loop to copy the
> > file to another location.
> >
> > The purpose of the entire exercise is to put file names into two tables for
> > log backups. I'm backing up log files to a local directory, and want to copy
> > them to a network share. Since I only want to copy newly backed up files, I
> > would like to go thru all the files on the local drive, write them to a
> > table, go thru all the previously copied files on the network share, write
> > the names to another table, join the two tables to find which files do not
> > exist on the network share and copy them to it.
> >
> > There may be a better way to do this, but I've not figured it out so far.
> >
> > Thanks for any assistance with this.
> >
> > TomT
> You're going about this the hard way. If you can run xp_cmdshell, you
> can do this with a single insert statement:
> CREATE TABLE #Table (
> [FileName] VARCHAR(255)
> )
> DECLARE @.Command VARCHAR(255)
> SELECT @.Command = 'master..xp_cmdshell ''DIR /B C:\WINDOWS'''
> INSERT INTO #Table
> EXEC (@.Command)
> SELECT * FROM #Table
> DROP TABLE #Table
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com
>|||TomT wrote:
> Thanks Tracy, I guess I was hoping to incoporate this into an SSIS package
> which would do the backups, history clean up, and file copy. Also, trying to
> get up to speed on SSIS capabilities.
>
Ahh, ok... Well, this may or may not interest you, I have a script that
will automatically backup any database on your server, including t-logs,
and will handle the history cleanup too...
http://realsqlguy.com/twiki/bin/view/RealSQLGuy/AutomaticBackupOfAllDatabases
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||Thanks Tracy - that's very helpful, and definately of interest to me.
At the same time, I'm still interested in finding out if there's a way to do
what I'm attempting via SSIS, particularly getting the file loop variable
into a sql insert statement.
I appreciate your feedback and assistance with this issue...
"Tracy McKibben" wrote:
> TomT wrote:
> > Thanks Tracy, I guess I was hoping to incoporate this into an SSIS package
> > which would do the backups, history clean up, and file copy. Also, trying to
> > get up to speed on SSIS capabilities.
> >
> Ahh, ok... Well, this may or may not interest you, I have a script that
> will automatically backup any database on your server, including t-logs,
> and will handle the history cleanup too...
> http://realsqlguy.com/twiki/bin/view/RealSQLGuy/AutomaticBackupOfAllDatabases
>
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com
>|||Hi,
Thanks for your post!
To make me clear about your issue, I appreciate to know:
1) You could retrieve each filename in a variable now and you didn't know
how to insert it into a table?
If so, you can write SQL like this:
INSERT INTO Table_Name([colname1],...[colnamen])
values(@.FileName1,...,[@.othervalue])
2) Are your purpose as following ?
Retrieve related files on the local drive and insert their names into
one table;
Retrieve related files on the remote network drive and insert their
names into a second table;
Compare the data between the two tables, if the 1st table's files names
are not in the second table, copy them into the network drive.
For such issue, you can't directly use TSQL in workflow control in SSIS.
I recommend you use "Execute SQL task" and map the variable to a
parameter in the task.
Right click task->Edit->Parameter Mapping.
Then you could use the parameter as you mentioned in the SQLstatement in
General tab of "Execute SQL task"
http://msdn2.microsoft.com/en-us/library/ms141003.aspx
Also, you could configure "variable mapping" of foreach loop container
to the variable you uses.
Other tasks like ActiveX Scripts Task can also help you deal with this
case.
http://msdn2.microsoft.com/en-us/library/ms137525.aspx
If you have any other concerns, please feel free to let me know. I'm
happy for your assistance.
+++++++++++++++++++++++++++
Charles Wang
Microsoft Online Partner Support
+++++++++++++++++++++++++++
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================Business-Critical Phone Support (BCPS) provides you with technical phone
support at no charge during critical LAN outages or "business down"
situations. This benefit is available 24 hours a day, 7 days a week to all
Microsoft technology partners in the United States and Canada.
This and other support options are available here:
BCPS:
https://partner.microsoft.com/US/technicalsupport/supportoverview/40010469
Others:
https://partner.microsoft.com/US/technicalsupport/supportoverview/
If you are outside the United States, please visit our International
Support page:
http://support.microsoft.com/default.aspx?scid=%2finternational.aspx.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.|||TomT - it sounds like everyone has an opinion! Here is a way to do
what you want with SSIS. Then see below that for my opinion :).
1 - Open your new SSIS project, and add a Foreach Loop container
2 - In the properties, make it a Foreach File Enumerator, and choose
the folder you want. Set the other properties, such as if you want the
fully qualified filename
3 - In the variable mappings area, Add a new user variable named
"FileName"; it should appear as User::FileName. Click OK & you're done
with the loop container
4 - Now go modify the variable properties (View, Other Windows,
Variables). Highlight the variable created in #3 and press F4 to get
the properties window.
5 - Change "EvaluateAsExpression" to True
6 - Change the "Expression" property to be your SQL Statement, but now
you're including your variable (notice the single quotes around your
variable):
"INSERT test SELECT '" + @.[User::FileName] + "'"
7 - Now add an "Execute SQL Task" to your loop container. Choose your
database. Change the SQL SourceType to be Variable.
8 - Choose your variable, which has now been filled with your SQL
statement.
9 - test out by debugging & then select the data from your table
SELECT * FROM test
Your next step - you mentioned it was to copy the files from the
original location to a new location using the table to loop. You can
skip all of the above madness, by just using the file system task, and
copy the entire contents of directory #1 to directory #2.
---
I'm wondering if this can be done (I've had no luck so far):
I would like to loop thru a set of files in a directory, place the file
name
into a variable, then insert the file name as a record into a table.
I've gotten the for each (file) loop set up and working fine, placing
the
file name into a user variable, but can't figure out how to use the
value of
the variable to insert it into a table.
The next step would be to loop thru the records in the table retrieving
each
file name into a variable to use in another for each file loop to copy
the
file to another location.
The purpose of the entire exercise is to put file names into two tables
for
log backups. I'm backing up log files to a local directory, and want to
copy
them to a network share. Since I only want to copy newly backed up
files, I
would like to go thru all the files on the local drive, write them to a
table, go thru all the previously copied files on the network share,
write
the names to another table, join the two tables to find which files do
not
exist on the network share and copy them to it.
There may be a better way to do this, but I've not figured it out so
far.
Thanks for any assistance with this.
TomT|||Thanks Corey - that's the direction I wanted to go with this (getting the
variable mapped properly in the sql statement.
There are a couple of reasons I'm going about things in this way. Since I'm
doing log backups every hour locally, and then copying these files over to
another system, I want to only copy over the most recent file. The other
(network) system should also have copies of previously copied logs. I wanted
to keep the i/o down, and only copy over the latest backup.
Secondly, I wanted to do this as a way to get more familiar with SSIS. I
think all I need to do really is just get the name of the most recent backup
log file, but I don't know how to get that info via SSIS.
So, e.g., say I've got two days worth of log backups on the local system,
then copy these over (say manually just for now) to a network system. Then,
the next time a log backup takes place on the local system, I want to copy
JUST that one over to the network system. If I could just identify that
particular (most recent) log backup's file name, I'd be set.
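Identifying that most recent file is a few lines of script (for instance inside an SSIS Script Task, or as a standalone job). This Python sketch is illustrative only; the folder paths and the .trn extension are assumptions, not anything from the thread:

```python
import os
import shutil

def newest_backup(folder, ext=".trn"):
    """Return the full path of the most recently modified backup file,
    or None if the folder has no matching files."""
    files = [os.path.join(folder, f) for f in os.listdir(folder)
             if f.lower().endswith(ext)]
    return max(files, key=os.path.getmtime) if files else None

def copy_if_missing(src_file, share):
    """Copy src_file to the share only if it isn't already there.
    Returns True if a copy happened."""
    target = os.path.join(share, os.path.basename(src_file))
    if os.path.exists(target):
        return False
    shutil.copy2(src_file, target)
    return True
```

Comparing the file system directly like this sidesteps the two staging tables and the join: the share itself records what has already been copied.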
Again, this is just as much to learn SSIS as anything else right now.
Thanks for your help
"CoreyB" wrote:
> TomT - it sounds like everyone has an opinion! Here is a way to do
> what you want with SSIS. Then see below that for my opinion :).
> 1 - Open your new SSIS project, and add a Foreach Loop container
> 2 - In the properties, make it a Foreach File Enumerator, and choose
> the folder you want. Set the other properties, such as if you want the
> fully qualified filename
> 3 - In the variable mappings area, Add a new user variable named
> "FileName"; it should appear as User::FileName. Click OK & you're done
> with the loop container
> 4 - Now go modify the variable properties (View, Other Windows,
> Variables). Highlight the variable created in #3 and press F4 to get
> the properties window.
> 5 - Change "EvaluateAsExpression" to True
> 6 - Change the "Expression" property to be your SQL Statement, but now
> you're including your variable (notice the single quotes around your
> variable):
> "INSERT test SELECT '" + @[User::FileName] + "'"
> 7 - Now add an "Execute SQL Task" to your loop container. Choose your
> database. Change the SQL SourceType to be Variable.
> 8 - Choose your variable, which has now been filled with your SQL
> statement.
> 9 - test out by debugging & then select the data from your table
> SELECT * FROM test
>
> Your next step - you mentioned it was to copy the files from the
> original location to a new location using the table to loop. You can
> skip all of the above madness, by just using the file system task, and
> copy the entire contents of directory #1 to directory #2.
>
>|||Sounds good - good luck!

Integration Services extraction/loading throughput/performance

I'm new to Integration Services.
I want to create a centralized reporting system for our customers. Some customers have up to 1,000 sites and some are expected to grow past 5,000 sites. The sites are running POS applications and I want to extract the POS sales data from these sites. Is it practical to expect that SSIS can handle the extraction of data from this many sites and load the data into a central SQL database? The POS sales data at the sites is stored in SQL Express databases, but the data is also available in XML format.
If it's practical for Integration Services to do this, how frequently is it possible to pull this data?
I realize that the amount of data is relative, but I'm just wondering if anyone is attempting to do this with Integration Services.
If not with Integration Services, then what method(s) are available and used to extract data from this many remote sites?

SSIS is a very good fit for this type of problem. Loading data from flat files/XML files/databases and inserting it into an RDBMS is an everyday task for SSIS and something it does very efficiently.

SSIS does not place any restriction on how often you pull the data. This decision depends on many factors, such as:

latency of data|||Thanks for the reply!

Do you know where I may find a case study, or do you know of anyone else extracting data from this many sites? I really do expect some customers to grow to 5,000 sites, so before investing in this strategy I want to see if it's being done.

Thanks!
|||

I don't know of any case studies, no. Fundamentally, SSIS will be able to do this. Whether it will work in your situation is down to the factors that I mentioned, not the lack of functionality.

-Jamie

Integration Services Designer (Visual Studio) very very slow

Hi all

I have some performance issues when developing Integration Services packages ...

It is very, very slow. For example, when I try to add a new connection in Connection Manager, it takes about 2 minutes just to open the property window.

On another PC in the same environment it works OK.
In the past I have built a lot of SSIS packages ... Is there any cache to empty?

Thanks for any comments

Best regards
Frank Uray

Have you tried the 'Work Offline' option in the SSIS menu? That option prevents SSIS from validating the connections in the package at design time. I am not sure if that is the source of your problem, but you can try it.|||

Thank you for your answer,
but this does not help, it is still very slow ... :-(

Best regards
Frank Uray

|||Um, well... What are your machine specifications? CPU? RAM? etc...

What's the version number of SSIS?|||

Hi Phil

Here are the specifications:
- Dual Core AMD Opteron 2.20 GHz.
- 3.50 GB RAM
- 160 GB Disk
- HP xw9300 Workstation
- Windows XP SP2
- Visual Studio 8.0.50727.42 (RTM.050727-4200)

Hope you can find out something ...

Thanks and best regards
Frank Uray

|||

That's weird. I have half of that hardware and the same VS version, and it runs with no problem. Do you get the same behavior even when you create a new project/package? Check the Task Manager activity as well.

|||

Hi Rafael

Yes, I can create a new project, and when I try to add a new connection it takes 2 minutes to open the first dialog box ...
Task Manager is very quiet ...

Also, when I run a package on my PC, it takes about the same 2 minutes to begin running the package.
When I run it on the server, the package finishes in 30 seconds.

Thanks and best regards
Frank Uray

|||This could also just be caused by network configurations when trying to do the connection discovery.|||


1. For the Integration Services designer you should also install SQL 2005 SP1 and the SQL 2005 hotfix 2153.

2. Another tip I found in the newsgroups: in Internet Explorer go to Tools -> Options -> Advanced and uncheck the tickbox "Check for publisher's certificate revocation" (thanks to Nico Verheire).

3. When the designer is extremely slow in complex data flows, you have to break the package up into several data flows and use raw files to transfer data from one data flow to the other. See article <http://www.sqljunkies.com/WebLog/reckless/archive/2006/05/01/ssis_largedataflows.aspx>

I hope this can help.

Met vriendelijke groeten - Best Regards - Cordialement - Mit Freundlichen

Gruessen

Jan D'Hondt

Jade bvba

www.jadesoft.be

Integration Services Designer (Visual Studio) is very very slow

Hi all

I have some performance issues when developing Integration Services packages ...

It is very, very slow. For example, when I try to add a new connection in Connection Manager, it takes about 2 minutes just to open the property window.

On another PC in the same environment (domain etc.) it works OK.
In the past I have built a lot of SSIS packages ... Is there any cache to empty?

Thanks for any comments

Best regards
Frank Uray

Do you have the SSIS service running? That can sometimes speed things up because it caches information about tasks and components.

Other than that - it's more likely this can be attributed to hardware or network issues. I suppose it could be any of a million and one things.

-Jamie

Integration Services Data Types Maximum Length

Hi,

Is there a way, in code, to determine the maximum length of an Integration Services data type?

I need to determine, based on the data type, what the maximum length of a column is, IN CODE.

However, the column.Length property only gives me a length for DT_WSTR and DT_STR values. This is the only property that would seem to remotely give me the right answer.

I need to know the maximum lengths in columns for DT_BOOL, DT_CY, DT_I2, DT_I4, DT_I8, DT_NUMERIC, and DT_UI1. I can always hard-code these values into my program, but that makes no sense. There has to be some sort of way to determine what the maximum possible length of these values are.

For numeric values I could use the column.Precision value but that still leaves with with a lot of data types without a maximum length.
It might help if you explain why you want to know this information.

A DT_I4 is 4 bytes long, for instance, but it can hold -2,147,483,648 to 2,147,483,647. So which value do you want? Do you see the problem?

Data types are fixed in what they can handle, so why not just hard code them into your "IN-CODE"?|||In your example, for DT_I4, -2,147,483,648 is a length of 11 (excluding commas, but including the sign). Thats the kind of value I need to know. I apologize if I wasn't clear on that.

Well, yes, I could hard-code them. But I was wondering whether that type of information is actually stored (which really, if you think about it, shouldn't be that difficult, and would also be worthwhile so a person does not have to compute what each one would be).

Hard-coding is something you also try to avoid if there is a way to avoid it (and hence my question as to whether I can avoid this or not).

Thanks

|||But that's just it, the length of a DT_I4 is 4, not 11.

The "length" of 11 is not stored anywhere. It's a limitation of the storage allocated to the data type. There are probably internal bounds checks, but I'm guessing they are not exposed. You could always try one of the many programming forums in MSDN to see if someone has an idea, as this really isn't an SSIS issue. SSIS data types are simply mapped to structures in .Net.

And the max length of a DT_STR is 8000 bytes. DT_WSTR is 4000 bytes. But more importantly, there is no such thing as a DT_STR, DT_I4, etc... in code. So what SSIS has as limitations may not be the same as the structures available in your code. (A DT_I4 equates to the Integer (Int32) structure.)|||

theddern wrote:

Hi,

Is there a way in-code to determine the maximum length of a Integration Services Data Type.

I need to determine based on the data type what the maximum length of a column is IN-CODE.

However, the column.Length property only gives me a length for DT_WSTR and DT_STR values. This is the only property that would seem to remotely give me the right answer.

I need to know the maximum lengths in columns for DT_BOOL, DT_CY, DT_I2, DT_I4, DT_I8, DT_NUMERIC, and DT_UI1. I can always hard-code these values into my program, but that makes no sense. There has to be some sort of way to determine what the maximum possible length of these values are.

For numeric values I could use the column.Precision value but that still leaves with with a lot of data types without a maximum length.

There is no maximum/minimum length for those data types. They are just what they are - they never change. The length simply isn't relevant.

-Jamie

|||

theddern wrote:

In your example, for DT_I4, -2,147,483,648 is a length of 11 (excluding commas, but including the sign).

No it's not. The length of it is 4. 4 bytes, that is.

If the value that you want is 11 then just cast it as a DT_STR and get the length of that.

-Jamie
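Jamie's "cast it and measure" suggestion can also be turned into a lookup derived from each type's bounds. A sketch (the DT_* table below is hand-built from the ranges discussed in this thread, not read from any SSIS API):

```python
# Maximum display width (in characters) for several fixed-size SSIS
# numeric types, computed from their value ranges rather than hard-coded.
# The bounds follow the .NET integer structures these types map to.
type_bounds = {
    "DT_BOOL": (0, 1),               # maps to bit
    "DT_UI1":  (0, 255),             # unsigned 1-byte integer
    "DT_I2":   (-2**15, 2**15 - 1),
    "DT_I4":   (-2**31, 2**31 - 1),
    "DT_I8":   (-2**63, 2**63 - 1),
}

def max_display_length(dt_name):
    """Widest character rendering of any value of the type,
    including a leading minus sign where applicable."""
    lo, hi = type_bounds[dt_name]
    return max(len(str(lo)), len(str(hi)))

print(max_display_length("DT_I4"))  # 11, matching the example above
```

This still amounts to maintaining the table yourself; it just keeps the arithmetic out of your hands.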

|||So what you are saying is that I need to hard-code the lengths if I need to know those particular type of values?

|||

theddern wrote:

So what you are saying is that I need to hard-code the lengths if I need to know those particular type of values?

Yes! I just can't see the value in knowing the "lengths" of data types.|||And to clarify, I need to know what the maximum length (in characters) a particular column can be if that particular data was transferred to a comma-delimited flat file.

I was trying to figure that out based on IDTSColumn90.DataType because that would be the most logical place to start.
|||Are you wanting the display length for each data type?|||

theddern wrote:

And to clarify, I need to know what the maximum length (in characters) a particular column can be if that particular data was transferred to a comma-delimited flat file.

Why? If it is a delimited file, who cares?|||If he wanted to document specs for a fixed-length output file, he would find these useful. |||

theddern wrote:

So what you are saying is that I need to hard-code the lengths if I need to know those particular type of values?

No. I'm saying you don't need to care.

|||

Phil Brammer wrote:

theddern wrote:

And to clarify, I need to know what the maximum length (in characters) a particular column can be if that particular data was transferred to a comma-delimited flat file.

Why? If it is a delimited file, who cares?

Very very very good point.

This seems a very strange thing to want to do.

|||Yes

Integration Services Data Types

Hi, I have a question regarding the Integration Services Data Types.

From http://msdn2.microsoft.com/en-us/library/ms141036(d-printer).aspx, I found a table that shows me the Mapping of Integration Services Data Types to Database Data Types.

For example, how the DT_BOOL Data Type maps to bit for SQL Server.

In this case, I am okay, as I know exactly what the mapping is, however, for some of the datatypes, I do not.

Here is an example. The DT_CY datatype maps to smallmoney and money ... how do I know which one to map to? For me, which one I map to does indeed matter because their representation is different.

DT_NUMERIC maps to decimal and numeric ... this one does not matter as much

DT_STR/DT_WSTR ... I need to know whether it's char, varchar, nchar, or nvarchar, for padding purposes mostly.

Any help would be gladly appreciated.
As for DT_CY, you pick. Either will work.

DT_STR = varchar, char
DT_WSTR = nvarchar, nchar|||From what I am doing with the values, I cannot just pick for DT_CY. I need to know whether it is actually smallmoney or money.

Same goes with DT_STR and varchar, char ... I need to know whether its one or the other.

And similarly for nvarchar/nchar for DT_WSTR.

I am passing these values to an application that needs to know what is what because it treats each value differently.

|||

theddern wrote:

From what I am doing with the values, I can not just pick for DT_CY. I need to know whether it is actually smallmoney, or money.

Same goes with DT_STR and varchar, char ... I need to know whether its one or the other.

And similarly for nvarchar/nchar for DT_WSTR.

I am passing these values to an application that needs to know what is what because it treats each value differently.

You are on the wrong end of the question though.

YOU have to decide which SQL Server data type best fits the data. SSIS doesn't dictate that; you do.

So yes, you have to pick and stick with it. Do you understand the differences in the SQL Server datatypes? You might not ever use char/nchar (retains trailing spaces) so that might solve that issue for you. DT_CY, well, you just need to know what the data supports and choose the correct one.|||Alright thanks
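Phil's point - choose the currency type from what the data actually supports - can be sketched as a quick range check. This is illustrative only, not SSIS API; the bounds are SQL Server's documented smallmoney range:

```python
# Pick a SQL Server currency type for a DT_CY column based on the
# data's actual range. smallmoney bounds per SQL Server documentation.
SMALLMONEY_MIN, SMALLMONEY_MAX = -214748.3648, 214748.3647

def pick_currency_type(values):
    """Return 'smallmoney' if every value fits its range, else 'money'."""
    if all(SMALLMONEY_MIN <= v <= SMALLMONEY_MAX for v in values):
        return "smallmoney"
    return "money"

print(pick_currency_type([19.99, 1200.50]))  # smallmoney
print(pick_currency_type([5000000.00]))      # money
```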

Integration Services Considerations

We have about 150 SQL servers and basically we're considering the pros and cons of installing SSIS on a central SSIS server - that is responsible for all DTS jobs - as opposed to installing SSIS on the local SQL instance.

On the plus side so far:

1./ Central administration, alerting, change management etc

2./ Possible performance gain on the local instance not having SSIS installed?

On the negative side:

1./ Central point of failure

2./ Possibility that it would need to be a clustered...

3./ Compatibility issues may mean having to make the central SSIS server 32-bit?

4./ Possible performance cost of remote SSIS?

5./ With multiple DTS packages running at different times, when would we take the server down for maintenance...?

Would appreciate your thoughts.

First, you've shown us the root without the leaves; in other words, what data are you transforming/moving across these 150 SQL Servers?

The SSIS server is used to run packages. Let's say you have 150 packages to run: can the central server handle that workload?

Depending on the business logic, I would build an SSIS grid with 10-15 nodes that can run the packages, so you have many points of failure rather than a single one.

As for the other part, moving data from one network of SQL Servers (which is homogeneous and, I guess, doesn't need data cleaning/transformation) to another can be done using replication or Service Broker.

I guess you have to build a data warehouse that centralizes data from the 150 SQL Servers. That is obviously done nightly, while the people (OLTP applications) sleep, so the SSIS operations can't affect performance.