
Friday, March 30, 2012

Internal Activation Best Practices?

I am looking for an example of a stored procedure that shows the best practices for internal activation. In BOL, this topic describes the typical pattern for reading messages from a queue. What is the typical pattern for reading messages from a queue using an internally activated SP? Do we still need to loop (considering that the message arrival actually fired the SP)?

Any advice provided would be helpful.

Thanks!

Yes, you should still loop. If you don't loop, you risk getting at most one activated procedure, instead of the value set for max_queue_readers. If a procedure is launched and it exits immediately (expecting to be launched again), the activation algorithm timers will be reset by this event, so the activation will never reach the point when it has to launch a new instance of the procedure.

The Activation algorithm is described here: http://msdn2.microsoft.com/en-us/library/ms171601.aspx

Another important reason why you should loop is the need to batch commits. If your procedure RECEIVEs one message, processes it, and commits, you won't be able to scale. Even the best-tuned system will only be able to process tens of messages per second, maybe at best a few hundred. You have to batch several tens, even hundreds, of messages in one transaction to be able to process thousands of messages per second. The trick here is to avoid holding up a transaction waiting for a message. That is, if you have a batch pending, do a plain RECEIVE, not a WAITFOR(RECEIVE). If the RECEIVE returns empty, commit the pending batch and only then do the WAITFOR(RECEIVE).

Do linger a few seconds in your loop before exiting. That is, have a WAITFOR (RECEIVE ...), TIMEOUT 1000 (or 2000) drive the loop. This way, the activated procedure will already be running if a new message arrives quickly after you just drained the queue. Don't linger too long, as you're keeping a server thread occupied.
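A minimal sketch of such a loop (the queue name, batch size of 100, and 2000 ms timeout are illustrative, not prescriptive; handling of the system message types is omitted here):

```sql
-- Illustrative sketch of an internally activated procedure that batches commits.
CREATE PROCEDURE usp_TargetQueueReader
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @ch UNIQUEIDENTIFIER;
    DECLARE @msgType SYSNAME;
    DECLARE @body VARBINARY(MAX);
    DECLARE @pending INT;
    DECLARE @rc INT;

    BEGIN TRANSACTION;
    SET @pending = 0;
    WHILE (1 = 1)
    BEGIN
        IF (@pending = 0)
            -- No batch pending: it is safe to linger inside WAITFOR.
            WAITFOR (
                RECEIVE TOP (1)
                    @ch = conversation_handle,
                    @msgType = message_type_name,
                    @body = message_body
                FROM [TargetQueue]), TIMEOUT 2000;
        ELSE
            -- Batch pending: plain RECEIVE, never hold the batch hostage in WAITFOR.
            RECEIVE TOP (1)
                @ch = conversation_handle,
                @msgType = message_type_name,
                @body = message_body
            FROM [TargetQueue];
        SET @rc = @@ROWCOUNT;

        IF (@rc = 0)
        BEGIN
            -- Queue momentarily empty: commit whatever batch is pending ...
            COMMIT TRANSACTION;
            IF (@pending = 0)
                BREAK;  -- ... and exit if the linger produced nothing new
            BEGIN TRANSACTION;
            SET @pending = 0;
        END
        ELSE
        BEGIN
            -- ... process @msgType / @body here ...
            SET @pending = @pending + 1;
            IF (@pending >= 100)
            BEGIN
                COMMIT TRANSACTION;
                BEGIN TRANSACTION;
                SET @pending = 0;
            END
        END
    END
END
```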

Don't fake an empty queue. That is, don't have a RECEIVE ... WHERE that will return an empty rowset even when there are messages in the queue. This will fool the activation algorithm into thinking that you drained the queue, and will prevent new instances of the procedure from being launched.

Activated procedures have a different execution context than procedures launched from a user connection. They run under an EXECUTE AS USER context, thus under all the restrictions described here: http://msdn2.microsoft.com/en-us/library/ms188304.aspx. So try to design the procedure to avoid the need to access resources outside the database. If you must, then you'll have to either turn on the trustworthy bit on the database, or follow these complex steps: http://blogs.msdn.com/remusrusanu/archive/2006/03/07/545508.aspx

Do make sure your procedure handles the system message types:
http://schemas.microsoft.com/SQL/ServiceBroker/Error
http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog
http://schemas.microsoft.com/SQL/ServiceBroker/DialogTimer
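Inside the processing step of the loop, the dispatch on these types might look like this (dbo.BrokerErrors and the handler procedures are invented for illustration):

```sql
-- Dispatch on the system message types inside the processing step.
IF @msgType = N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
    END CONVERSATION @ch;          -- the peer ended the dialog: end our side too
ELSE IF @msgType = N'http://schemas.microsoft.com/SQL/ServiceBroker/Error'
BEGIN
    -- Log the error payload somewhere durable, then end the conversation.
    INSERT INTO dbo.BrokerErrors (conversation_handle, error_body)
        VALUES (@ch, @body);
    END CONVERSATION @ch;
END
ELSE IF @msgType = N'http://schemas.microsoft.com/SQL/ServiceBroker/DialogTimer'
    EXEC dbo.usp_HandleTimer @ch;                       -- hypothetical handler
ELSE
    EXEC dbo.usp_HandleApplicationMessage @ch, @body;   -- hypothetical handler
```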

Don't roll back intentionally in your procedure. Repeated rollbacks will trigger the poison message detection and disable your queue.

Do experiment with various ways of writing the RECEIVE loop, e.g.:
- TOP (1) into T-SQL variables
- INTO @tableVariable and open a cursor over @tableVariable
- CLR procedure and RECEIVE a whole resultset into a SqlDataReader
See which one gives the best performance for your environment. Some are better when you're likely to always have one message returned; some are better when you're likely to have many messages returned by RECEIVE. Consider that RECEIVE returns messages from only one conversation group.
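For instance, the second variant might be sketched like this (queue and cursor names are illustrative):

```sql
-- Variant: RECEIVE a whole batch INTO a table variable, then cursor over it.
DECLARE @batch TABLE (
    conversation_handle UNIQUEIDENTIFIER,
    message_type_name   SYSNAME,
    message_body        VARBINARY(MAX));
DECLARE @ch UNIQUEIDENTIFIER, @msgType SYSNAME, @body VARBINARY(MAX);

WAITFOR (
    RECEIVE conversation_handle, message_type_name, message_body
    FROM [TargetQueue]
    INTO @batch), TIMEOUT 2000;

DECLARE batchCursor CURSOR LOCAL FORWARD_ONLY READ_ONLY FOR
    SELECT conversation_handle, message_type_name, message_body FROM @batch;
OPEN batchCursor;
FETCH NEXT FROM batchCursor INTO @ch, @msgType, @body;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- ... process @msgType / @body here ...
    FETCH NEXT FROM batchCursor INTO @ch, @msgType, @body;
END
CLOSE batchCursor;
DEALLOCATE batchCursor;
```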

HTH,
~ Remus

|||

It has been brought to my attention that one of my comments is wrong:

If a procedure is launched and it exits immediately (expecting to be launched again), the activation algorithm timers will be reset by this event, so the activation will never reach the point when it has to launch a new instance of the procedure.

This is incorrect: exiting the procedure does not reset the timer, and the activation algorithm works as expected in these conditions; max_queue_readers is reached.

HTH,
~Remus

|||Great info Remus!

Monday, March 26, 2012

Intermediate table relations

Is there a way to find what tables are needed to form a relationship between two tables?

For example, I have tblIssue, tblElementType, and 22 other tables. There isn't a direct relationship between tblIssue and tblElementType, but through tblIssue -> tblA -> tbl... -> tblZ -> tblElementType there could be a relationship. Is there a way to find out what tblA -> tbl... -> tblZ are?

Create a database diagram in Visio. It'll show the FK relationships.

Adamus
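If you'd rather query it than diagram it, the foreign key edges can be pulled from the catalog views (SQL Server 2005 and later; you then chain the edges by hand, or with a recursive CTE, to find a path from tblIssue to tblElementType):

```sql
-- List every FK edge in the current database; chain the rows
-- to discover an indirect path between two tables.
SELECT fk.name                               AS fk_name,
       OBJECT_NAME(fk.parent_object_id)      AS referencing_table,
       OBJECT_NAME(fk.referenced_object_id)  AS referenced_table
FROM sys.foreign_keys AS fk
ORDER BY referencing_table, referenced_table;
```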

Friday, March 9, 2012

Intellisense in Views corrupts SQL Syntax

When selecting "Design mode" from "Views", the syntax I enter is
automatically changed on-the-fly and corrupts the syntax.
For example from pubs:
SELECT TOP 100 PERCENT *
FROM dbo.authors a INNER JOIN
dbo.publishers p ON a.city = p.city
ORDER BY a.au_lname DESC
Problems:
1. I can't remove "TOP 100 PERCENT", which is valid SQL.
2. "a INNER JOIN" should be "AS a INNER JOIN", which would be valid.
So, how can I code valid SQL without the code being changed?
Sparky
EM has a lot of strange things in the query builder. I recommend developing your views in Query
Analyzer. As for TOP, you can't have ORDER BY without TOP in a view. A view is supposed to behave as
a table, and a table is not ordered. A view allows ORDER BY if you have TOP, as this will influence
which rows are returned; the ordering is still not guaranteed. You might want to look at
this:
http://www.aspfaq.com/show.asp?id=2455
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
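To illustrate the point (a sketch against pubs): the view compiles only because TOP is present, and the ORDER BY inside it is not a guarantee; order the outer query instead.

```sql
-- A view may contain ORDER BY only together with TOP.
CREATE VIEW dbo.vAuthorsByName
AS
SELECT TOP 100 PERCENT a.*, p.pub_name
FROM dbo.authors AS a
INNER JOIN dbo.publishers AS p ON a.city = p.city
ORDER BY a.au_lname DESC;
GO
-- The ORDER BY inside the view is not guaranteed to survive;
-- put the ORDER BY on the query that selects from the view.
SELECT * FROM dbo.vAuthorsByName ORDER BY au_lname DESC;
```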
|||The problem arises when I use Visual Studio 2003's Server Explorer tool to
explore a selected "View" from the SQL Server treeview. I was not using EM
or QA to edit the view.
Tibor: Thanks so much for the reply - saw your web site -- Great!
Sparky

Integrity with multiple commands

How can I make sure that a group of commands is either executed in full on the database or not at all? For example, right now I have an insert, an update, and a delete command, and I'm calling each of them with a SqlCommand. So I am afraid that one of them might be executed, then there's a bad connection and the other two are not. How can I prevent this, so that either all of the commands or none are executed on the database?

Write them in a stored procedure!

Stored procedures are a precompiled collection of SQL statements and optional control-of-flow statements stored under a name and processed as a unit.

In short, it's something atomic: many databases use mechanisms such as rolling back when an SP terminates unexpectedly, so either none or all of the commands in an SP will be executed.

|||

A stored procedure has absolutely nothing to do with it. You want to place your code within a transaction. You can do this manually by specifying a BEGIN TRANSACTION (your statements here) and then either ROLLBACK TRANSACTION or COMMIT TRANSACTION, *OR* you can use the SqlTransaction object to put multiple SqlCommands within the same transaction.

You can of course put the whole thing in a stored procedure as well for convenience, but putting things in an SP won't guarantee they will either all commit or roll back.

This would be an example of what you can place within the CommandText of the SqlCommand:

Dim cmd As New SqlCommand("SET @RetVal=0; BEGIN TRANSACTION; INSERT INTO Table1(col1) VALUES (@col1); IF @@ERROR=0 BEGIN INSERT INTO Table2(idcol,col2) VALUES (SCOPE_IDENTITY(),@col2) IF @@ERROR=0 BEGIN COMMIT TRANSACTION SET @RetVal=1 END END IF @RetVal=0 ROLLBACK TRANSACTION")

cmd.Parameters.Add("@col1", SqlDbType.VarChar).Value = something

cmd.Parameters.Add("@col2", SqlDbType.VarChar).Value = something

cmd.Parameters.Add("@RetVal", SqlDbType.Int).Direction = ParameterDirection.Output

cmd.ExecuteNonQuery()

If cmd.Parameters("@RetVal").Value = 0 Then

' It failed

End If

You can also do it this way:

Dim cmd As New SqlCommand("SET @RetVal=0; SET XACT_ABORT ON; BEGIN TRANSACTION; INSERT INTO Table1(col1) VALUES (@col1); INSERT INTO Table2(idcol,col2) VALUES (SCOPE_IDENTITY(),@col2); COMMIT TRANSACTION; SET @RetVal=1")

That should work as well.
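For completeness, here is a pure T-SQL sketch of the same all-or-nothing intent using SQL Server 2005's TRY/CATCH (the table and procedure names are invented for illustration):

```sql
CREATE PROCEDURE dbo.usp_ApplyChanges
    @col1 VARCHAR(50),
    @col2 VARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;   -- any run-time error dooms the whole transaction
    BEGIN TRY
        BEGIN TRANSACTION;
        INSERT INTO dbo.Table1 (col1) VALUES (@col1);
        INSERT INTO dbo.Table2 (idcol, col2) VALUES (SCOPE_IDENTITY(), @col2);
        DELETE FROM dbo.Table3 WHERE col1 = @col1;
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0
            ROLLBACK TRANSACTION;
        -- Re-raise so the caller's SqlCommand sees the failure.
        DECLARE @msg NVARCHAR(2048);
        SET @msg = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);
    END CATCH
END
```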

Friday, February 24, 2012

Integration Services Designer (Visual Studio) very very slow

Hi all

I have some performance issues when developing Integration Services packages ...

It is very, very slow: for example, when I try to add a new connection in
Connection Managers, it takes about 2 minutes just to open the property window.

On another PC in the same environment it works OK.
In the past I have done a lot of SSIS packages ... Is there any cache to empty?

Thanks for any comments

Best regards
Frank Uray

Have you tried the 'Work Offline' option in the SSIS menu? That option will prevent SSIS from validating connections in the package at design time. I am not sure if that is the source of your problem, but you can try it.|||

Thank you for your answer,
but this does not help, it is still very slow ... :-(

Best regards
Frank Uray

|||Um, well... What are your machine specifications? CPU? RAM? etc...

What's the version number of SSIS?|||

Hi Phil

Here are the specifications:
- Dual Core AMD Opteron 2.20 GHz.
- 3.50 GB RAM
- 160 GB Disk
- HP xw9300 Workstation
- Windows XP SP2
- Visual Studio 8.0.50727.42 (RTM.050727-4200)

Hope you can find out something ...

Thanks and best regards
Frank Uray

|||

That's weird. I have half of that hardware and the same VS version and run with no problem. Do you get the same behavior even when you create a new project/package? Check the Task Manager activity as well.

|||

Hi Rafael

Yes, I can create a new project, and when I try to
add a new connection it takes 2 minutes to open
the first dialog box ...
Task Manager is very quiet ...

Also, when I run a package on my PC, it takes about
2 minutes to begin running the package.
When I run it on the server, the package finishes in 30 seconds.

Thanks and best regards
Frank Uray

|||This could also just be caused by network configurations when trying to do the connection discovery.|||


1. For the Integration Services designer you should also install SQL 2005 SP1 and the SQL 2005 hotfix 2153.

2. Another tip I found in the newsgroups:

In Internet Explorer, go to Tools -> Options -> Advanced

Uncheck the tickbox "Check for publisher's certificate revocation"

(tnx to Nico Verheire)

3. When the designer is extremely slow in complex data flows, you have to break up the package into several data flows and use raw files to transfer data from one data flow to the other. See article <http://www.sqljunkies.com/WebLog/reckless/archive/2006/05/01/ssis_largedataflows.aspx>

I hope this can help.

Met vriendelijke groeten - Best Regards - Cordialement - Mit Freundlichen

Gruessen

Jan D'Hondt

Jade bvba

www.jadesoft.be

Integration Services Designer (Visual Studio) is very very slow

Hi all

I have some performance issues when developing Integration Services packages ...

It is very, very slow: for example, when I try to add a new connection in
Connection Managers, it takes about 2 minutes just to open the property window.

On another PC in the same environment (domain etc.) it works OK.
In the past I have done a lot of SSIS packages ... Is there any cache to empty?

Thanks for any comments

Best regards
Frank Uray

Do you have the SSIS service running? That can sometimes speed things up because it caches information about tasks and components.

Other than that, it's more likely this can be attributed to hardware or network issues. I suppose it could be a million and one things.

-Jamie

Sunday, February 19, 2012

integrating AS with RS

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsql2k/html/olapasandrs.asp
I don't want to sound dumb, but the first example that they have in this
report says:
"By the time you've finished reading this whitepaper, you should be able to
discern which of these expressions will work as the source of a matrix and
which will not."
The only problem I see so far is that these MDX expressions are _EXACTLY_
the same.
Am I missing something?
You can do better than this, Microsoft :)
Aaron

aaron_kempf@hotmail.com wrote:
> The only problem i see so far; is that these MDX expressions are
> _EXACTLY_
> the same
> am i missing something?
LOL... Aaron,
I read this paper yesterday too... and I asked myself the same thing as you.
regards
Frank
--
www.xax.de

Integrated SQL across large geographical areas.

Is it possible to integrate SQL Server 2005 across a large geographical area?

For example, from England to India?

I don't understand your question. Is this an SSIS-related issue? If not, please use the appropriate forum.