I'm upgrading an app I wrote from SQL 2000 to 2005 and looking for some
high-level design advice, mainly to determine how much should be T-SQL vs.
CLR, and the best approach to take for performance. For background, I'm a
strong C# developer, but with only average SQL skills, and I'm just getting
into SQL 2005.
My Current (SQL 2000) Setup:
--
My C# app regularly collects performance information, then inserts it into a
table. On insert into that table I have a trigger that kicks off a couple of
stored procs that compare the new data to other tables where thresholds are
set. As a simple example, it essentially does a "SELECT * FROM Thresholds
WHERE @NewlyInsertedValue > ThresholdValue" to determine if the current
value is in violation of any thresholds. (There are a couple more columns
considered that make the query a little more intensive, but this is the
general idea.)
If thresholds are violated, it inserts a record into an 'alerts' table and
generates an email (by another stored proc). If no thresholds are violated,
it runs a stored proc to clear any alerts that might have been created
previously since we're not in violation anymore. Since this happens EVERY
time a new value is collected, I really want to optimize this as much as
possible.
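Schematically, the whole check amounts to something like this (the object
names are made up for illustration; the real DDL has more columns):

-- Illustrative sketch only: real table/column names differ.
CREATE PROCEDURE dbo.CheckThresholds
    @CounterID int,               -- which metric was collected
    @NewValue  decimal(18,4)      -- the value just inserted
AS
BEGIN
    SET NOCOUNT ON;

    IF EXISTS (SELECT 1 FROM dbo.Thresholds
               WHERE CounterID = @CounterID
                 AND @NewValue > ThresholdValue)
        -- in violation: record an alert (the email goes out separately)
        INSERT INTO dbo.Alerts (CounterID, ObservedValue, RaisedAt)
        VALUES (@CounterID, @NewValue, GETDATE());
    ELSE
        -- back under every threshold: clear any earlier alerts
        DELETE FROM dbo.Alerts
        WHERE CounterID = @CounterID;
END
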
My Desired (SQL 2005) Setup:
--
I'm going to replace certain obvious functions, like the "send email"
function, moving them from old SQL objects to .NET CLR objects. However, I
want to further optimize the system that compares incoming data with
thresholds and generates alerts as necessary. It seems very wasteful to do
selects every time a new value is inserted, and as the tables grow, even
with proper indexing, it is going to get expensive. Is there a way I can
take advantage of the CLR to keep these thresholds in memory and compare
against them? Any other ideas about how to accomplish this most efficiently?
Craig
PS - I can post DDL if you really want it; I was just trying to get
high-level design ideas from someone who's done a lot of SQL CLR work
already.

For general data manipulation, especially inserts, updates, deletes, and
normal selects, T-SQL will always be the best choice. The CLR is useful for
heavy calculations and aggregations, or when you simply can't do something
with T-SQL. Of course, any time you need to visit the filesystem or need the
equivalent of an extended stored procedure, the CLR is prime for that as
well. As for the mail component, you can use the new Database Mail, which
works pretty well and is a lot less troublesome than the MAPI-based mail in
2000.

In general, though, I suggest you use stored procedures for this instead of
triggers. You have a fair amount of processing to do, and you should keep
triggers as trim as possible. I don't see any reason this logic cannot be
put into a standard T-SQL stored procedure. As for checking the values, why
not do that in the front end or middle tier before you even attempt the
insert? That kind of information is pretty much static and can be cached
very easily in the middle tier or front end. If the inserts are frequent,
you can save a lot of processing on the back end by validating the data
before you send it in.
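To illustrate, the alert mail becomes a single call to
msdb.dbo.sp_send_dbmail (this assumes a Database Mail profile, here called
'AlertsProfile', has already been configured; the profile name and recipient
are placeholders):

-- Placeholder profile and recipient: substitute your own.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'AlertsProfile',
    @recipients   = N'ops@example.com',
    @subject      = N'Threshold violation',
    @body         = N'A collected value exceeded its configured threshold.';
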
Andrew J. Kelly SQL MVP
"Craig S" <cscheets@.dontspam.kc.rr.com> wrote in message
news:OIG$CfXDGHA.1544@.TK2MSFTNGP10.phx.gbl...

First, you're right - everything I'm doing can be done in T-SQL (it is today
in SQL 2000). I just wanted to see if CLR would offer any optimization.
Secondly, I think you're right about caching the data and performing the
checks for violations in the middle tier (I'm using web services). From
what I understand, I can actually subscribe to changes to a table/view in
2005, so I could cache that info and just rebuild it any time the
thresholds table was updated. The thresholds data is pretty static, so this
could definitely work.
I guess this is basically what I was trying to do, but for some reason I
thought it needed to live in the db. Logic is always best left in the
middle/front tier anyway, I guess.
So, good call, Andrew. Anyone else have anything to add/recommend?
Craig
"Andrew J. Kelly" <sqlmvpnooospam@.shadhawk.com> wrote in message
news:O9eKD9XDGHA.3444@.TK2MSFTNGP10.phx.gbl...

Craig S (cscheets@dontspam.kc.rr.com) writes:
> My C# app regularly collects performance information, then inserts it
> into a table. On insert in that table I have a trigger to kick off a
> couple stored procs that compare the new data to other tables where
> thresholds are set. As a simple example, it essentially does a "SELECT *
> FROM Thresholds WHERE @NewlyInsertedValue > ThresholdValue" to determine
> if the current value is in violation of any thresholds. (There are a
> couple more columns considered that make the query a little more
> intensive, but this is the general idea.)
As long as you have full control over the data, using a stored procedure
to insert is better. Triggers are more difficult to write, and there is
one ugly thing about them: they can disappear without notice. If a stored
procedure is dropped inadvertently, your application will cry out, but if
a trigger is missing, you will only get incorrect results.
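A sketch of that pattern, reusing the made-up names from the earlier
example (the application calls this procedure instead of inserting
directly, so no trigger is involved):

-- Illustrative: a single entry point the application calls.
CREATE PROCEDURE dbo.InsertPerfValue
    @CounterID int,
    @Value     decimal(18,4)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.PerfValues (CounterID, Value, CollectedAt)
    VALUES (@CounterID, @Value, GETDATE());

    -- the same threshold check as before, now explicit rather than
    -- hidden in a trigger
    EXEC dbo.CheckThresholds @CounterID = @CounterID, @NewValue = @Value;
END
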
> It seems very wasteful to do selects every time a new value is inserted,
> and as the tables grow even with proper indexing it is going to get
> expensive. Is there a way I can take advantage of CLR to maybe keep
> these thresholds in memory to compare them against?
Maybe there is, but it would be a bad idea. Don't worry, that data
is in memory already. SQL Server takes care of that, and keeps as
much data in cache as possible.
> I guess this is basically what I was trying to do, but for some reason I
> thought it needed to live in the db. Logic is always best left in the
> middle/front tier anyway, I guess.
Personally, I'm a strong believer in having the business logic where the
data is.
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
Design Question
I am just getting into SQL Reporting Services and I am looking into
upgrading some Access reports to SQL Reporting Services. In the Access
reports there was an interface that allowed the user to select some
different options and then generate the report based on those options. To
generate the report there was a large amount of VBA code that created some
collector tables that were then used by the report. My question is: what
would be the best way to recreate this functionality in Reporting Services?
Would it be best to have an external assembly that I call and pass the
parameters into, which would then create and populate tables on the SQL
Server that the report could use to display the data? If so, how would I go
about connecting to the database? Or would it be better to create a custom
data processing extension to do this? I have not looked into data
processing extensions much yet, so I am not sure it is even possible to do
this.
Thanks for your suggestions
--
Message posted via http://www.sqlmonster.com

I am assuming here that the data you are reporting against is in SQL
Server. The way you have solved it today in Access essentially gives you
your own temporary tables: the recordsets in your VBA code acted as temp
tables. I think you would be better off creating a stored procedure than
going either of the two paths you mention. A custom data processing
extension is not easy; doable, but non-trivial. One point: when creating
the stored procedure, don't try to do a straight port of your VBA code;
rewrite it. Your VBA code is most likely cursor-based (walking through
recordsets), whereas in a stored procedure you should use set-based logic:
fill temp tables, join against them, and so on. I once took 64 pages of C
code and turned it into 2 pages of a stored procedure. You can use cursors
in a stored procedure, but that should be a last resort.
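As a very rough sketch of the set-based shape (every object name here is
invented purely for illustration):

-- Hypothetical report procedure: fills the collector table with one
-- set-based statement, then joins for the final report rows.
CREATE PROCEDURE dbo.ReportSalesSummary
    @RegionID int
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.CustomerID,
           SUM(o.Amount) AS TotalAmount,
           COUNT(*)      AS OrderCount
    INTO #Summary
    FROM dbo.Orders AS o
    WHERE o.RegionID = @RegionID
    GROUP BY o.CustomerID;

    SELECT c.CustomerName, s.TotalAmount, s.OrderCount
    FROM #Summary AS s
    JOIN dbo.Customers AS c ON c.CustomerID = s.CustomerID
    ORDER BY s.TotalAmount DESC;
END
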
With the next beta of Whidbey there will be WinForm and WebForm controls
that might make this easier.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Jason DeWeeese via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:6626b9ec55f6440c865863d7d4c984dd@.SQLMonster.com...

Thanks for your suggestion; that is one other alternative that I had briefly
considered but had not really looked into. Thanks.
--
Message posted via http://www.sqlmonster.com