
Monday, March 26, 2012

Parent Package Variables and SQL Package Store

Hello All,

I have a Master Package that calls a group of other packages on the SSIS Package Store using parent/child configurations and a number of variables. These seem to work for passing various audit information (audit key, record counts, etc.), but when I try to pass a variable for the connection string and assign it via an expression to the connection manager, the Master Package writes to the correct database while the child packages write to their "default" database, i.e. the default value of the variables that should be populated by the parent.

In other words, I end up with my master package audit information in database A and child package audit and transaction data in database B, even though the child packages are supposed to be reading the connection string from a parent variable. Any clues or suggestions?

Edit: The correct parent audit information and record counts are still being passed along; the children just aren't connecting to the correct database from the variables holding the connection string.

Thanks for your time.

After your issue the other day, it occurred to me that I do something differently, yet achieve the same thing.

I pass in pieces of the connection string as separate variables (ServerName, DatabaseName, etc.).

Then, on the connection manager objects, I use their expressions to build the connection string from those variables. So I'm not passing in a connection string -- I'm just passing in pieces. Then the connection manager's expression for "ConnectionString" takes it from there. Does that make sense?|||

Yeah, that makes sense... I've been thinking about doing that for about a day now myself, although I wanted to get this working first before going through with it. I think I will go ahead and change things over to build the string up dynamically; that way I won't have to retype everything in the Set Values tab every time I want to tinker with which database or server it points to.

Right now I am building a test package that passes the connection string, and it works fine; however, the test package is passing it to a package on the file server (not in the SQL package store). There shouldn't be any difference in how a package accepts parent variables between a file-system-executed package and a SQL-store-executed package, should there?

|||

Well, I just spent a few hours converting those packages over to store the data source, initial catalog, name and password as separate variables and build the string up in the expression for the connection string. When I ran the package it was still pointing to the "default" values defined in the child package, even though I had changed the parent to point to a different database in the source...

Any ideas?

|||

EWisdahl wrote:

Well, I just spent a few hours converting those packages over to store the data source, initial catalog, name and password as separate variables and build the string up in the expression for the connection string. When I ran the package it was still pointing to the "default" values defined in the child package, even though I had changed the parent to point to a different database in the source...

Any ideas?

Are you certain that the parent package variable configuration is set up properly? You may want to try adding a Script Task to your control flow, and within it using a MessageBox to pop up the value being configured just to validate.

Personally, I solve this problem with XML file configurations. I have one dtsConfig file that drives the ConnectionString property of the connection, and have connection managers with the same name in my master and child packages, and this works great for me.

|||Also try setting DelayValidation to true on your connection manager that you are configuring.|||

I am fairly certain that I have them set up correctly, in that I am also passing along other information such as audit id, filename, record count, etc., and that is all populating correctly. It seems like the connection manager's connection string has been predetermined before the package receives the parent variable configurations...

|||

Phil Brammer wrote:

Also try setting DelayValidation to true on your connection manager that you are configuring.

Didn't seem to help any...

|||

EWisdahl wrote:

Phil Brammer wrote:

Also try setting DelayValidation to true on your connection manager that you are configuring.

Didn't seem to help any...

Then make sure your parent package configurations are working correctly. Make sure you're mapping to the Value property of your child variable. Also, CaSE matters, I believe. And I always just type in the name of the variable, with no "User::" or "System::" prefix.|||

Double checked and triple checked. As I said, I am able to pass certain values (audit key, etc.) with no problem. As a matter of fact, I am able to pass the same connection string (or the portions to piece it together) if I am passing it to a package via the file system; the problem only seems to come up when passing parent package configuration variables from a parent to a child stored in the SSIS package store.

On a positive note, setting values through the SQL Agent job's Set Values tab works fine for the master package now... if only it would pass these to the child packages (or if the child packages would pick them up from the master's "default"). Sigh.

|||

After reading another thread, I think I may have described something a bit inaccurately. I am storing these packages on the SSIS server under the MSDB folder (i.e. connect to the Integration Services server, expand Stored Packages, then MSDB, then the package folder, and import), as opposed to the package store's file system (same as above, but under the File System node within Stored Packages). Would that make any difference?

|||

Question: Are you using any other configurations in your child packages, besides the Parent Package Variables? If so, can you describe those?

|||

Nope, just the parent package variables.

|||

Well, I am currently facing something like this...

And I am starting to think that parent variables are somehow out of sync with the child package execution...

In my situation I have an Execute SQL Task at the beginning of the package registering the initial log record... and it's being inserted with the default values from my child package instead of the configured parent package values!

Later on I have that kind of connection string mechanism and it's working right... I think it's a matter of delay... probably if you try putting a Script Task before using the connection, you will get your parent variables in time.

Have you tried it?

Just a thought...

Regards

Wednesday, March 21, 2012

Partitioning - Execution Plan

Hello group,
I'm considering creating a partitioned view on a Sales table that
contains up to 20 million records. I'm doing some tests and have a result
I'm questioning when looking at the execution plan generated.
First, let me explain my structure a bit. In my Sales table, I have
Ex-Factory sales information and externally provided information. They're
distinguished using an identifier (0 = Ex Factory, 1 = External).
I've created 8 tables:
SalesTSA1999      Check Constraint (Origin=1 And BilledDate BETWEEN '19990101' And '19991231')
SalesTSA2000      Check Constraint (Origin=1 And BilledDate BETWEEN '20000101' And '20001231')
SalesTSA2001      Check Constraint (Origin=1 And BilledDate BETWEEN '20010101' And '20011231')
SalesTSA2002      Check Constraint (Origin=1 And BilledDate BETWEEN '20020101' And '20021231')
SalesXFactory1999 Check Constraint (Origin=0 And BilledDate BETWEEN '19990101' And '19991231')
SalesXFactory2000 Check Constraint (Origin=0 And BilledDate BETWEEN '20000101' And '20001231')
SalesXFactory2001 Check Constraint (Origin=0 And BilledDate BETWEEN '20010101' And '20011231')
SalesXFactory2002 Check Constraint (Origin=0 And BilledDate BETWEEN '20020101' And '20021231')
and a view that includes all these tables using a UNION ALL:
viewSalesPartitioned
Then, I'm issuing the following query:
Select Sum(Amount)
From viewSalesPartitioned
Where Origin=1 AND BilledDate BETWEEN '20020101' AND '20021231' -- No worry about time, it's all set to 00:00:00
When looking at the execution plan in Query Analyzer I can see the execution
plan is "almost" the perfect query. I mean, on each table I can see
a "Filter Cost 0%", except for SalesXFactory2002, where there's a table scan
with a cost of 6%. In my opinion it should do a filter like on the other
tables, based on the check constraint on the Origin column. Maybe there's
something I don't understand properly (I'm just starting to use execution
plan analysis, which is sometimes a bit confusing for me...)
Any help would be appreciated. Thanks!
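For reference, here is a minimal sketch of the layout described above. The table and view names, plus the Origin, BilledDate and Amount columns, come from the question; the key column, data types and constraint names are hypothetical, and this is illustrative only rather than a fully tuned partitioned-view design.

CREATE TABLE dbo.SalesTSA2002 (
    SalesID    int      NOT NULL,   -- hypothetical key column
    Origin     tinyint  NOT NULL,   -- 0 = Ex Factory, 1 = External
    BilledDate datetime NOT NULL,
    Amount     money    NOT NULL,
    CONSTRAINT PK_SalesTSA2002 PRIMARY KEY (SalesID),
    CONSTRAINT CK_SalesTSA2002
        CHECK (Origin = 1 AND BilledDate BETWEEN '20020101' AND '20021231')
);
-- ...the other seven member tables follow the same pattern...
GO

CREATE VIEW dbo.viewSalesPartitioned
AS
SELECT SalesID, Origin, BilledDate, Amount FROM dbo.SalesTSA1999
UNION ALL
SELECT SalesID, Origin, BilledDate, Amount FROM dbo.SalesTSA2000
-- ...remaining member tables...
UNION ALL
SELECT SalesID, Origin, BilledDate, Amount FROM dbo.SalesXFactory2002;
GO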
|||

Christian Hamel wrote:
> Hello group,
> I'm considering creating a Partitioned view on a Sales table that
> contains up to 20 Millions records. I'm doing some test and have a
> result I'm questionning when looking at the Execution Plan generated.
> First, let me explain a bit my structure. In my Sales table, I have
> Ex-Factory sales information and Externally provided information.
> There distinguised using an identifier (0=Ex Factory, 1=External).
> I've created 8 Tables
> SalesTSA1999 Check Constraint (Origin=1 And BilledDate
> BETWEEN '19990101' And '19991231')
> SalesTSA2000 Check Constraint (Origin=1 And BilledDate
> BETWEEN '20000101' And '20001231')
> SalesTSA2001 Check Constraint (Origin=1 And BilledDate
> BETWEEN '20010101' And '20011231')
> SalesTSA2002 Check Constraint (Origin=1 And BilledDate
> BETWEEN '20020101' And '20021231')
> SalesXFactory1999 Check Constraint (Origin=0 And BilledDate
> BETWEEN '19990101' And '19991231')
> SalesXFactory2000 Check Constraint (Origin=0 And BilledDate
> BETWEEN '20000101' And '20001231')
> SalesXFactory2001 Check Constraint (Origin=0 And BilledDate
> BETWEEN '20010101' And '20011231')
> SalesXFactory2002 Check Constraint (Origin=0 And BilledDate
> BETWEEN '20020101' And '20021231')
> and a view that include all theses tables using a UNION ALL:
> viewSalesPartitioned
> Then, I'm issuing the following Query:
> Select Sum(Amount)
> From viewSalesPartitioned
> Where Origin=1 AND BilledDate BETWEEN '20020101' AND '20021231' --No
> worry about Time, it's all set to 00:00:00
> When looking at the Execution Plan in Query Analyzer I can see the
> execution plan is "almost" getting the perfect query. I mean, on
> each table I can see a "Filter Cost 0%" Except for SalesXFactory2002
> where there's a table scan with a cost of 6%. In my opinion it
> should do a filter like on the other tables based on the check
> constraint on the column Origin. Maybe there's something I don't
> understand properly (I'm just starting using the Execution Plan
> analysis which is sometimes a bit confusing for me...) Any help would
> be appreciated. Thanks!
Indexing on a character or bit value that allows only two values will
normally not add any performance benefit because the column selectivity
is too low. You might be better off using the date as the partitioning
column without the Origin. You could also try adding a hint to force SQL
Server to use the clustered index.
I would also change the check constraints to avoid any possibility of a
time portion triggering an error:
Check Constraint (Origin=1 And BilledDate >= '20000101' And BilledDate <
'20010101')
David Gugick
Quest Software
www.imceda.com
www.quest.com|||"Christian Hamel" <chamel@.NoSpAmMeRaLlOwEd.CoM> wrote in message
|||

"Christian Hamel" <chamel@.NoSpAmMeRaLlOwEd.CoM> wrote in message
news:O%23AEf7exFHA.3720@.TK2MSFTNGP11.phx.gbl...
> Hello group,
> I'm considering creating a Partitioned view on a Sales table that
> contains up to 20 Millions records. I'm doing some test and have a result
> I'm questionning when looking at the Execution Plan generated.
> First, let me explain a bit my structure. In my Sales table, I have
> Ex-Factory sales information and Externally provided information. There
> distinguised using an identifier (0=Ex Factory, 1=External).
> I've created 8 Tables
> SalesTSA1999 Check Constraint (Origin=1 And BilledDate
> BETWEEN '19990101' And '19991231')
> SalesTSA2000 Check Constraint (Origin=1 And BilledDate
> BETWEEN '20000101' And '20001231')
> SalesTSA2001 Check Constraint (Origin=1 And BilledDate
> BETWEEN '20010101' And '20011231')
> SalesTSA2002 Check Constraint (Origin=1 And BilledDate
> BETWEEN '20020101' And '20021231')
> SalesXFactory1999 Check Constraint (Origin=0 And BilledDate BETWEEN
> '19990101' And '19991231')
> SalesXFactory2000 Check Constraint (Origin=0 And BilledDate BETWEEN
> '20000101' And '20001231')
> SalesXFactory2001 Check Constraint (Origin=0 And BilledDate BETWEEN
> '20010101' And '20011231')
> SalesXFactory2002 Check Constraint (Origin=0 And BilledDate BETWEEN
> '20020101' And '20021231')
> and a view that include all theses tables using a UNION ALL:
> viewSalesPartitioned
> Then, I'm issuing the following Query:
> Select Sum(Amount)
> From viewSalesPartitioned
> Where Origin=1 AND BilledDate BETWEEN '20020101' AND '20021231' --No worry
> about Time, it's all set to 00:00:00
> When looking at the Execution Plan in Query Analyzer I can see the
> execution
> plan is "almost" getting the perfect query. I mean, on each table I can
> see
> a "Filter Cost 0%" Except for SalesXFactory2002 where there's a table scan
> with a cost of 6%. In my opinion it should do a filter like on the other
> tables based on the check constraint on the column Origin. Maybe there's
> something I don't understand properly (I'm just starting using the
> Execution
> Plan analysis which is sometimes a bit confusing for me...)
> Any help would be appreciated. Thanks!
>
Partitioned views only support a single partitioning column.
So the check constraints should be just on the date, and in the query plan
you shouldn't even see the other partitions. What you're currently seeing
is not true partition elimination. The query is just hitting each PK index
and quickly eliminating the partitions which don't contain the requested values.
Partitioned views allow the data in a large table to be split into smaller
member tables. The data is partitioned between the member tables based on
ranges of data values in one of the columns. The data ranges for each member
table are defined in a CHECK constraint specified on the partitioning
column.
http://msdn2.microsoft.com/ms248875
If you add an index on Origin, or perhaps add Origin as well as BilledDate
to your primary key, you should be able to eliminate the table scan. You
will get partition elimination to avoid hitting the other time periods, and
normal index seek behavior to eliminate the other Origin.
David
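A minimal sketch of that suggestion follows; the index name is hypothetical, and BilledDate is added as a second key column here (my choice, so the date predicate can use the same index), while adding Origin to the primary key would be the alternative David mentions:

CREATE NONCLUSTERED INDEX IX_SalesXFactory2002_Origin
    ON dbo.SalesXFactory2002 (Origin, BilledDate);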
|||

My TinyInt column is not part of the index.
I thought that since there was a check constraint on the table that allows
only a specific value, SQL Server would be kind enough not to scan the
records, based on the constraint and the value specified in my query.
Thanks for the reply.
"David Gugick" <david.gugick-nospam@.quest.com> a crit dans le message de
news: %23NC0dEfxFHA.3236@.TK2MSFTNGP14.phx.gbl...
> Christian Hamel wrote:
> Indexing on a character or bit value that allows only two values will
> normally not add any performance benefit because the column selectivity is
> too low. You might be better off using the date as the patitioning column
> without the Origin. You could also try adding a hint to force SQL Server
> to use the clustered index.
> I would also change the check constraints to avoid any possibility of a
> time portion triggering an error:
> Check Constraint (Origin=1 And BilledDate >= '20000101' And BilledDate <
> '20010101')
>
> --
> David Gugick
> Quest Software
> www.imceda.com
www.quest.com

|||

OK, I see then. I'll check what I can do.
Thank you very much. I'll go to bed more intelligent tonight :)
"David Browne" <davidbaxterbrowne no potted meat@.hotmail.com> a crit dans
le message de news: uzk3sGfxFHA.1032@.TK2MSFTNGP12.phx.gbl...
> "Christian Hamel" <chamel@.NoSpAmMeRaLlOwEd.CoM> wrote in message
> news:O%23AEf7exFHA.3720@.TK2MSFTNGP11.phx.gbl...
> Partitioned views only support a single partitioning column.
> So the check constraints should be just on the date. And in the query
> plan you shouldn't even see the other partitions. What you're currently
> seeing is not true partition elimination. The query is just hitting each
> PK index and quickly eliminating the partitions which don't contain the
> that column.
> Partitioned views allow the data in a large table to be split into smaller
> member tables. The data is partitioned between the member tables based on
> ranges of data values in one of the columns. The data ranges for each
> member table are defined in a CHECK constraint specified on the
> partitioning column.
> http://msdn2.microsoft.com/ms248875
> If you add an index on Origin, or perhaps add Origin as well as BilledDate
> to your primary key, you should be able to eliminate the table scan. You
> will get partition elimination to aviod hitting the other time period, but
> normal index s behavior to eliminate the other Origin.
> David
>
>

Tuesday, March 20, 2012

Parameters: How to set null value

Hi, I have a problem.
I have two parameters: Group and Selection

In the first I have 2 choices: Total and Day
And the value of the second parameter is dynamic and is set to allow null values.
The behaviour I want is this: if a user selects Total in the first parameter, the second must be set to NULL; if the selection is Day, the second must be set from a dataset.

How can I set the value of the second parameter to null? I tried...

Does anyone have an idea?

One way I have done this in the past is using '-1' as the Total value, and then letting the stored procedure pass in a null value by setting all '-1' parameters to null.

if @param1 = '-1' set @param1 = null
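A sketch of that approach end to end (the procedure name and parameter name are hypothetical; the Day table, a11 alias and DATE_ID column come from later in this thread):

CREATE PROCEDURE dbo.GetDates
    @param1 varchar(20)          -- second report parameter; '-1' means "Total"
AS
BEGIN
    SET NOCOUNT ON;

    -- Map the '-1' sentinel coming from the report to a real NULL.
    IF @param1 = '-1'
        SET @param1 = NULL;

    SELECT a11.DATE_ID
    FROM dbo.[Day] AS a11
    WHERE @param1 IS NULL            -- Total: no day filter
       OR a11.DATE_ID = @param1;     -- Day: filter to the selected value
END;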

System.DbNull.Value may work too.

Yet another option that may be possible is to use an expression to convert the parameter to a null value, and pass in the expression instead of the parameter to the report.

cheers,

Andrew

|||Thanks,
I tried System.DbNull.Value but it gives me this error: DbNull is a type in System and cannot be used as an expression.

Perhaps I did something wrong.
I wrote this in the dataset I use for the second parameter:

=iif(Parameters!parm1.Value = 2,
"select a11.DATE_ID from Day" ,System.DBNull )

Another question: where do I have to place this expression? In the stored procedure?

if @param1 = '-1' set @param1 = null

but I use a dataset.

Thanks in advance

|||

Not sure exactly how you are passing in a dataset as a parameter, but it looks interesting.

Instead of null, perhaps you could use "SELECT null as DateID"? Or maybe an empty string?

The syntax for getting dbnull is to use System.DBNull.Value

=iif(Parameters!parm1.Value = 2,
"select a11.DATE_ID from Day" ,System.DBNull.Value )

cheers,

Andrew

If I use this expression, "System.DBNull.Value", something doesn't work, but I don't understand what.
But following your example and this link forums.microsoft.com/MSDN/ShowPost.aspx?PostID=766574&SiteID=1
it works.

thanks

Monday, March 12, 2012

Parameters to Subreports within a matrix

We have a recurring project status report that uses a matrix (each matrix group is for an individual project). The name of the project renders fine and displays in the group, but when we try to pass this same value as a parameter to a subreport (a graph) that we also want to display in the matrix, the subreport is only rendered for the last instance of the matrix group (i.e. 4 of 4 has it, but 1-3 show a blank field in the matrix).

I assume this has something to do with using parameters within the matrix control, but didn't find any posts about that.

I should add that we've also tried to do this with a list instead of a matrix, and it fails in the same way (doesn't render all of the subreports).

Thanks for the help,

-p

This may be caused by an interaction between SQL and the Dundas control I'm using. I'm not 100% sure, but thought I'd post this in case anyone out there is having a similar problem.

http://support.dundas.com/forum/m.aspx?m=1978&mpage=1&key=subreport

I'll post an answer if I figure out one.

|||As mentioned in the link, SP2 fixed this.

Saturday, February 25, 2012

Parameters and Formulas in a group field

I have a formula in my group footer that I need to pass to a parameter. The formula doesn't come up in my Select Expert box, and when I try to manually add it, it gives me this error:
"This formula cannot be used because it must be evaluated later"

My formula is an account balance formula, and I have a parameter I want to use to choose where the balance is greater than .01.

How can I get this accomplished?

|||

So you want to use the result of the group footer formula in the record selection formula?
What is your formula? You might be able to use the group selection formula instead.
E.g., if you're grouping on account and summing a value field, your group selection formula could be something like
sum({value}, {account}) > 0
which will only return accounts whose 'balance' is > 0.

Parameters

Hi all,
Is it possible to group parameter fields into logical groups in the report
view?
eg.
filters:
start date:
end date:
colors:
table border:
chart background:
...
TIA,
Kamel

|||

It would be nice to have a separate control to manage parameter
presentation and logic.
Sometimes there is a need to have a lot of parameters (over 20), and
there is a presentation problem in the report viewer.
Kamel
kamel wrote:
> Hi all,
> Is it possible to group parameters fields into logical groups in report
> view?
> eg.
> filters:
> start date:
> end date:
> colors:
> table border:
> chart background:
> ...
> TIA,
> Kamel

Parameterizing page break in a group

We would like to have the page break for a group be based on a Yes/No
parameter that would be in the parameter section of the report. In the
grouping and sorting options, where the "Page Break at End" check box is,
there is no way to enter an expression. Is there any way to do this without
writing code, and/or what would be the code that would make the page break
conditional?

|||

I think this will work, though I haven't tried it. Put a Rectangle in the
bottom of your group footer with PageBreakAtEnd set to true. Then set the
Rectangle.Hidden expression (under visibility) to =not
cbool(Parameters!PageBreaks.Value) assuming you have a parameter called
PageBreaks.
"Alec Hardy" <AlecHardy@.discussions.microsoft.com> wrote in message
news:E730E694-6B20-4A4A-86A8-2BF05EF0DB84@.microsoft.com...
> We would like to have the Page Break for a group be based on a Yes/No
> Parameter that would be a the parameter section of the report. In the
> Grouping and sorting options where the "Page Break at End" check box is,
> there is no way to enter an expression. Is there any way to do this
> without
> writing code, and/or what would be the code that would make the Page Break
> conditional?|||Thanks Bob, we're going to look into it.
"Bob Fisher" wrote:
> I think this will work though I haven't tried it. Put a Rectangle in the
> bottom of your group footer with PageBreakAtEnd set to true. Then set the
> Rectangle.Hidden expression (under visiblility) to =not
> cbool(Parameters!PageBreaks.Value) assuming you have a parameter called
> PageBreaks.
> "Alec Hardy" <AlecHardy@.discussions.microsoft.com> wrote in message
> news:E730E694-6B20-4A4A-86A8-2BF05EF0DB84@.microsoft.com...
> > We would like to have the Page Break for a group be based on a Yes/No
> > Parameter that would be a the parameter section of the report. In the
> > Grouping and sorting options where the "Page Break at End" check box is,
> > there is no way to enter an expression. Is there any way to do this
> > without
> > writing code, and/or what would be the code that would make the Page Break
> > conditional?
>
> >

|||

Can anyone verify if this worked?
I tried entering an expression that would evaluate to true or false in the
element, but it did not work:
<PageBreakAtEnd>=CBool(iif(Parameters!GroupBreak1.Value = "true", true,
false))<PageBreakAtEnd>
Thanks!!!!
-Brian
"Alec Hardy" wrote:
> Thanks Bob, we're going to look into it.
> "Bob Fisher" wrote:
> > I think this will work though I haven't tried it. Put a Rectangle in the
> > bottom of your group footer with PageBreakAtEnd set to true. Then set the
> > Rectangle.Hidden expression (under visiblility) to =not
> > cbool(Parameters!PageBreaks.Value) assuming you have a parameter called
> > PageBreaks.
> >
> > "Alec Hardy" <AlecHardy@.discussions.microsoft.com> wrote in message
> > news:E730E694-6B20-4A4A-86A8-2BF05EF0DB84@.microsoft.com...
> > > We would like to have the Page Break for a group be based on a Yes/No
> > > Parameter that would be a the parameter section of the report. In the
> > > Grouping and sorting options where the "Page Break at End" check box is,
> > > there is no way to enter an expression. Is there any way to do this
> > > without
> > > writing code, and/or what would be the code that would make the Page Break
> > > conditional?
> >
> >
> >