Showing posts with label created. Show all posts

Friday, March 30, 2012

please help me figure out how best to approach my queue problem

Hi,

I am moving my system from a custom-built queue to a Service Broker queue.

I wish to have two queues:

one to hold processing messages
and one to hold emailing messages.

I do not understand why I would have to have an initiator queue and a target queue for each of the above. I have arranged my SQL as follows:

-- processing queue
CREATE MESSAGE TYPE ProcessingMessage VALIDATION = NONE
CREATE CONTRACT ProcessingContract (ProcessingMessage SENT BY INITIATOR)
CREATE QUEUE ProcessingQueue
CREATE SERVICE ProcessingService ON QUEUE ProcessingQueue (ProcessingContract)
GO

-- emailing queue
CREATE MESSAGE TYPE EmailingMessage VALIDATION = NONE
CREATE CONTRACT EmailingContract (EmailingMessage SENT BY INITIATOR)
CREATE QUEUE EmailingQueue
CREATE SERVICE EmailingService ON QUEUE EmailingQueue (EmailingContract)
GO

So basically EmailingQueue plays the role of both the initiator and the target (it sends messages to itself), and so does the ProcessingQueue.

I enqueue my messages with the following SP:

PROC [dbo].[queue_enqueue] (
@fromService SYSNAME,
@toService SYSNAME,
@onContract SYSNAME,
@messageType SYSNAME,
@entryId int)
AS
BEGIN
DECLARE @conversationHandle UNIQUEIDENTIFIER
DECLARE @error int

BEGIN DIALOG @conversationHandle
FROM SERVICE @fromService
TO SERVICE @toService
ON CONTRACT @onContract
WITH LIFETIME = 6000, ENCRYPTION = OFF;

SEND ON CONVERSATION @conversationHandle MESSAGE TYPE @messageType (@entryId)

END

I do the enqueueing with
[queue_enqueue] 'ProcessingService', N'ProcessingService', 'ProcessingContract', 'ProcessingMessage', 1

I dequeue my messages with the following SP:
PROC [dbo].[queue_dequeue]
@queue SYSNAME,
@entryId int OUTPUT
AS
BEGIN
DECLARE @conversationHandle UNIQUEIDENTIFIER;
DECLARE @messageTypeName SYSNAME;
DECLARE @conversationGroupId UNIQUEIDENTIFIER;
GET CONVERSATION GROUP @conversationGroupId FROM ProcessingQueue

IF (@conversationGroupId IS NOT NULL)
BEGIN
IF (@queue = 'ProcessingQueue')
RECEIVE TOP(1) @entryId = CONVERT(INT, [message_body]), @conversationHandle = [conversation_handle], @messageTypeName = [message_type_name] FROM ProcessingQueue WHERE conversation_group_id = @conversationGroupId
ELSE IF (@queue = 'EmailingQueue')
RECEIVE TOP(1) @entryId = CONVERT(INT, [message_body]), @conversationHandle = [conversation_handle], @messageTypeName = [message_type_name] FROM EmailingQueue WHERE conversation_group_id = @conversationGroupId
END
END

I dequeue by calling something like:
declare @entryId int
exec [queue_dequeue] 'ProcessingQueue', @entryId output

The above works; however, I have a few issues that I am having trouble figuring out.

1. I don't want to "fire and forget". I want to close the conversation at the initiator endpoint only when the target has closed the conversation.
2. I want to know how to properly close conversations at the initiator and the target.
3. I am not polling with a WHILE loop or WAITFOR; because of the way my system (higher up) is set up, I simply want to enqueue one message and dequeue one message. If an error occurs in enqueueing or dequeueing I want to raise that error.
4. How do I handle poison messages?
5. Is it better for me to use two queues (initiator and target) for each of my queues above?

Can someone please help me with these issues?

thanx

Are you sure the code is correct? You are retrieving the conversation group from one queue, but potentially receive from another. Besides, GET CONVERSATION GROUP locks the conversation solely for the duration of the transaction, so if you don't wrap the call to [queue_dequeue] in a transaction, you can have errors if multiple threads are calling this procedure. Also, using GET CONVERSATION GROUP is recommended solely for the cases when the application needs to look up some application-specific state (correlated with the conversation_group_id) before actually RECEIVEing the messages. Otherwise one can simply go ahead and call the RECEIVE verb directly.
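
For illustration, a direct RECEIVE wrapped in a transaction might look roughly like this (a minimal sketch based on the ProcessingQueue example above, not a drop-in replacement for the posted procedure):

-- Minimal sketch: receive one message directly, inside an explicit transaction
DECLARE @entryId int;
DECLARE @conversationHandle uniqueidentifier;
DECLARE @messageTypeName sysname;

BEGIN TRANSACTION;

RECEIVE TOP (1)
    @entryId = CONVERT(int, message_body),
    @conversationHandle = conversation_handle,
    @messageTypeName = message_type_name
FROM ProcessingQueue;

-- process @entryId here; a ROLLBACK would put the message back on the queue

COMMIT TRANSACTION;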

Service Broker queues are not identical to the computer science 'queue' data structure (i.e. a FIFO list). They are more like message storage areas for services, and services are endpoints for distributed applications communicating through messages. As such, using Service Broker queues probably won't map exactly 1-to-1 onto your previous user-table-queue code.

You already noticed that you now have to call END CONVERSATION. I'm not sure how the rest of the system works: are you reusing conversations when enqueueing? Is each enqueue a separate dialog? Probably the [queue_dequeue] procedure would have to call END CONVERSATION to end the conversation on which it received the current message. As long as you keep initiator and target on the same queue, this means that the same [queue_dequeue] procedure will have to handle the EndDialog messages (that would be one reason why you should split the queues).

|||Hi Remus, thanks for your reply!

I admit, I used GET CONVERSATION GROUP when I got desperate; I thought that RECEIVE wasn't locking the queue entry on its own. I'll get rid of it since we don't really need to reuse conversations. We don't have a message-intensive system; we are mostly worried about message integrity and reduced processor load as compared to our previous system.

So, I gather it is better for me to set up my queues in the conventional Service Broker way, with two queues (initiator and target) for each queue that I have.

I still don't fully understand how the "don't fire and forget" method works:

1. Initiator starts the dialog and sends a message; it does not end the dialog.
2. Target receives the message and ends the dialog, which in turn sends an EndDialog message to the initiator.
3. Initiator activation SP receives the EndDialog message and ends the dialog on the initiator side?

am I missing something?
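
For reference, here is roughly how I picture steps 2 and 3 in T-SQL (a minimal sketch; ProcessingTargetQueue and ProcessingInitiatorQueue are hypothetical names for a split initiator/target layout):

-- Step 2, target side: after processing the payload, end our half of the dialog
DECLARE @h uniqueidentifier, @mt sysname, @entryId int;
BEGIN TRANSACTION;
RECEIVE TOP (1)
    @entryId = CONVERT(int, message_body),
    @h = conversation_handle,
    @mt = message_type_name
FROM ProcessingTargetQueue;
IF @mt = N'ProcessingMessage'
BEGIN
    -- do the actual work with @entryId here, then:
    END CONVERSATION @h; -- queues an EndDialog message back to the initiator
END
COMMIT TRANSACTION;

-- Step 3, initiator side: receive the EndDialog (or Error) message and close our half
DECLARE @h2 uniqueidentifier, @mt2 sysname;
BEGIN TRANSACTION;
RECEIVE TOP (1)
    @h2 = conversation_handle,
    @mt2 = message_type_name
FROM ProcessingInitiatorQueue;
IF @mt2 IN (N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
            N'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
    END CONVERSATION @h2;
COMMIT TRANSACTION;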

thanx once again for your help.


Wednesday, March 28, 2012

Please Help - User Authentication Problem

Hi Lads,
I'm having this severe problem that's wrecking my head altogether! I created
a new user on my local machine, the purpose of this was to allow this user to
access certain reports through an asp.net application (to build up the
specified reports parameters, and pass them back using the soap API)
To test this I logged in as this user, and went to the report manager - but
nothing
came up! all I get is the home page with no reports! I granted "content
manager" access for this user for the group of reports when I logged in as
administrator... any ideas would be much appreciated!

Do they have access to anything in the Home folder? If not then they will see nothing. You have to give them a path to navigate to the reports they have permission to view.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"Rowan Massey" <RowanMassey@discussions.microsoft.com> wrote in message
news:7B34FA0E-AAF9-489A-A92C-A0CCB2FC6DE9@microsoft.com...
> Hi Lads,
> I'm having this severe problem that's wrecking my head altogether! I
> created
> a new user on my local machine, the purpose of this was to allow this user
> to
> access certain reports through an asp.net application (to build up the
> specified reports parameters, and pass them back using the soap API)
> To test this I logged in as this user, and went to the report manager -
> but
> nothing
> came up! all I get is the home page with no reports! I granted "content
> manager" access for this user for the group of reports when I logged in as
> administrator...any ideas would be much appreciated!
>

Monday, March 26, 2012

Please help - data looks god, but nothing on report!

1. Created DataSet which contains a DataTable.

2. ReportViewer points to .rdlc report.

3. .rdlc report uses DataSet above.

4. Programmatically load the DataTable from other tables.

5. In debug, dataset looks good, but NO data on report.

I have been working on this silly report for 4 days! Please help.

My apologies for the typo in the message subject.

Any ideas?

Tuesday, March 20, 2012

placing ssis packages on server

I have created my packages and I want to place them on the server. Do I need to place the entire project of DTS packages on the server, or is there an option to place just the executables? If so, please explain.

And to run these packages on the server, do I need to set them up as a new job in SQL Server Agent, or is there any other way to run them on the server?

I want them to run whenever the text file gets updated. Is it possible to set anything up for my packages to run as and when the text file gets updated?

Please help me with all my questions

Thanks in advance..

If you are using file-based packages (.dtsx), you just need to move the files to a server where the Integration Services service is running. Then, yes, you can use SQL Server Agent to schedule their execution. You could include some conditional logic in your package to check if the text file has been updated, and then run the package on a periodic basis.|||

What or how can I check to see if the file has been updated? Is there any task for this? Please help me with this.

All my files are .dtsx files, so I need to move all these files. Should the server have Business Intelligence Development Studio installed, or can I just take these .dtsx packages and execute them through SQL Server Agent?

Thanks

|||

OK, one thing at a time. BIDS is a development tool; it is not required on the server in order to execute the packages. The server needs to have an SSIS instance up and running; then you can use dtexec (the package execution utility) to run the package via the command line.

Additionally, if you want to schedule a package, you need to install the DB engine and SQL Server Agent components on the server.
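
As a rough illustration, creating such a job in T-SQL could look something like this (a minimal sketch; the job name and package path are made-up placeholders):

USE msdb;
GO

-- Create a job with one SSIS step that runs the package (the /FILE value is a placeholder path)
EXEC dbo.sp_add_job
    @job_name = N'Run MyPackage';

EXEC dbo.sp_add_jobstep
    @job_name = N'Run MyPackage',
    @step_name = N'Execute package',
    @subsystem = N'SSIS',
    @command = N'/FILE "D:\Packages\MyPackage.dtsx"';

-- Run it every day at 6:00 AM (freq_type 4 = daily)
EXEC dbo.sp_add_jobschedule
    @job_name = N'Run MyPackage',
    @name = N'Daily 6 AM',
    @freq_type = 4,
    @freq_interval = 1,
    @active_start_time = 060000;

-- Register the job on the local server
EXEC dbo.sp_add_jobserver
    @job_name = N'Run MyPackage',
    @server_name = N'(local)';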

Regarding how to include some logic inside of the package to check if the file has been 'updated': I know there have been similar discussions in this forum about that; here you have a couple:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1030485&SiteID=1

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=502717&SiteID=1

|||

>>The server needs to have an SSIS instance up and running; then you can use dtexec (the package execution utility) to run the package via the command line.

Could you please tell me what an SSIS instance means, and about the package execution utility? Sorry for the very basic questions; I am not experienced with this topic.

I want my package to run automatically, so I want to schedule it. Please let me know how to install the DB engine and SQL Server Agent components on the server.

Thanks

|||

Run the SQL Server setup program on the server. Choose Integration Services to get SSIS installed; choose Database Engine and Agent to get the other pieces.

Books Online has excellent help for setting up the components of SQL Server, and there are a number of walkthroughs available online if you search for "Setup SQL Server 2005".

Monday, March 12, 2012

placement of a new column

I was wondering if there is a way to add a new column to a table with
specific instruction as to where it should be created. i.e. after the 4th
column. Currently, a new column is added at the end of the table which I am
trying to avoid.
The solution I am looking for is via a script and not using the design table
option in the enterprise manager.
Many thanks
Shahriar

> Currently, a new column is added at the end of the table which I am
> trying to avoid.
WHY? Can you please explain why column order is important?
Enterprise Manager can kludge this for you, but PLEASE, PLEASE, PLEASE read
http://www.aspfaq.com/2528 before you try this on a table with anything more
than a handful of rows. You could certainly script something like what
Enterprise Manager does, but this is not using ALTER TABLE at all, and it is
a good 12kb worth of code. And as I already noted, you don't want to do
this on a large table, because it will make a complete copy of your existing
data before it's done. This can double (or more) your database size (albeit
temporarily). Plus the table is basically offline for that entire amount of
time, because you need to wrap the activity in a transaction...
This is my signature. It is a general reminder.
Please post DDL, sample data and desired results.
See http://www.aspfaq.com/5006 for info.|||Nope. The only way to do that is to do it the way EM does. Build your new
table exactly how you want it, copy all data over to it, and then drop the
old table.
"Shahriar" <Shahriar@discussions.microsoft.com> wrote in message
news:A0C8E551-5ED0-4C32-9929-904AA088FB82@microsoft.com...
>I was wondering if there is a way to add a new column to a table with
> specific instruction as to where it should be created. i.e. after the
> 4th
> column. Currently, a new column is added at the end of the table which I
> am
> trying to avoid.
> The solution I am looking for is via a script and not using the design
> table
> option in the enterprise manager.
> Many thanks
> Shahriar|||From Elmar's post:
"o Rename table with constraints and all (use sp_rename) Don't forget
referencing foreign keys.
o Create new table definition.
o INSERT data from the old table to the new table
o Create new referencing foreign keys pointing to new table.
o Drop old table
"
HTH, Jens Süßmeyer.
http://www.sqlserver2005.de
--|||"Jens Süßmeyer" > From Elmar's post:
> "o Rename table with constraints and all (use sp_rename) Don't forget
> referencing foreign keys.
If I do it using EM, will EM take care of foreign keys, defaults, etc.?|||Column order is not that important, but if you really need to have it in a
specific order, you'll need to dump the existing table into a temp table, re-create
the table with the column ordering you want, and then re-insert the rows. To me, it
seems more trouble than it's worth.
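
In case it helps, the rename-and-recreate recipe quoted earlier in the thread might look roughly like this in T-SQL (a minimal sketch with a hypothetical table dbo.MyTable; real tables would also need their constraints, indexes, triggers and permissions carried over):

-- Minimal sketch: recreate dbo.MyTable with the new column in the desired position
BEGIN TRANSACTION;

EXEC sp_rename 'dbo.MyTable', 'MyTable_old';

CREATE TABLE dbo.MyTable (
    MyTableID   int         NOT NULL PRIMARY KEY,
    NewColumn   varchar(50) NULL,        -- the new column, placed where you want it
    Description varchar(60) NOT NULL,
    dateCreated datetime    NOT NULL
);

INSERT INTO dbo.MyTable (MyTableID, NewColumn, Description, dateCreated)
SELECT MyTableID, NULL, Description, dateCreated
FROM dbo.MyTable_old;

-- recreate any foreign keys that referenced the old table here

DROP TABLE dbo.MyTable_old;

COMMIT TRANSACTION;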
"Shahriar" <Shahriar@.discussions.microsoft.com> wrote in message
news:A0C8E551-5ED0-4C32-9929-904AA088FB82@.microsoft.com...
> I was wondering if there is a way to add a new column to a table with
> specific instruction as to where it should be created. i.e. after the
4th
> column. Currently, a new column is added at the end of the table which I
am
> trying to avoid.
> The solution I am looking for is via a script and not using the design
table
> option in the enterprise manager.
> Many thanks
> Shahriar|||
"AB - MVP" wrote:

> WHY? Can you please explain why column order is important?
>
There are a few reasons that come to my mind; consistency is one of them.
In dealing with many tables, in my case over 180 of them, the placement
order is indeed important. Here is a little example. All my tables have a
field at the end called dateCreated. Opening EM, I can quickly locate that
column.
Second, I usually like to place primary keys first, then indexed key fields
next, and so on; it's a practice I have become accustomed to. Third, I
believe it is a practice that most DBAs have adopted: all reserved
fields are usually placed at the end of the table and not scattered around
in a table.
I hope that was a satisfactory answer to your question.
Thanks
Shahriar|||Yes it will.
"js" wrote:

> "Jens Süßmeyer" > From Elmar's post:
> If I do it using EM, EM will take care of foreignkey, default, etc?
>
>|||Shahriar,
Relational databases are not there to satisfy the whims and fancies of a
single user. The underlying framework is based on Completeness, Generality,
Formality & Simplicity, all with its foundation on set mathematics & logic.
It is this principle-based framework that makes relational databases
superior to alternatives.
Just because you like to do something with your table management interface
in the EM does not make it a meaningful and valid reason for the DBMS to
have such a provision. Having look-alike tables does not mean they are
"consistent" under the above-mentioned framework.
Again, that is your personal choice. Some other user might prefer a
different choice.
There is no such general practice. Note that there are certain specific
cases where SQL assigns positional significance to the order of columns in a
table, but that has nothing to do with user preferences or general
practices.
Anith|||Anith writes in regard to the relevance of a column position in a table:
> Relational databases are not there to satisfy the whims and fancies of a
> single user. The underlying framework is based on Completeness ,
> Generality, Formality & Simplicity, all with its foundation on set
> mathematics & logic.
DO YOUR TABLES LOOK LIKE THIS?
Fname, Address4,Zip, Telephone, MI, DOB,Lname,Address1, State,
Address2,email,Address3
Shahriar
"Anith Sen" <anith@bizdatasolutions.com> wrote in message
news:OYMP3vkUFHA.928@TK2MSFTNGP15.phx.gbl...
> Shahriar,
> Relational databases are not there to satisfy the whims and fancies of a
> single user. The underlying framework is based on Completeness,
> Generality, Formality & Simplicity, all with its foundation on set
> mathematics & logic. It is this principle-based framework that makes
> relational databases superior to alternatives.
>
> Just because you like to do something with your table management interface
> in the EM, does not make it a meaningful and valid reason for the DBMS to
> have such a provision. Having look-a-like tables does not mean they are
> "consistent" under the above mentioned framework.
>
> Again, that is your personal choice. Some other user might prefer a
> different choice.
>
> There is no such general practice. Note that, there are certain specific
> cases where SQL assigns positional significance to the order of columns in
> a table, but that has nothing to do with user preferences or general
> practices.
> --
> Anith
>

Friday, March 9, 2012

PL/SQL command to print a saved file in unix?

I've created and written a file to a unix directory using a stored procedure. Is there a way through that same stored procedure to send that file to a printer.

Right now I'm using lp -d 1_it_zeb overpack_lbl.txt to print the file from a command prompt. How would I send this same command to unix using my procedure?

Thanks!
Craig

One method is to create a pipe on unix that performs a specific task, say printing in your case. The following example shows a pipe that is created such that it compresses the file being written to (in this case, from PL/SQL).

#--Setup the pipeline to be used in the export process
mknod $dmp_file_name p
compress < $dmp_file_name > $dmp_file_name.Z &

You should be able to formulate a pipe to send the contents to a printer.

Or, you could set up a cron job to look at files being generated and, once completed, send them off to the printer.

Wednesday, March 7, 2012

PK columns dont show up in INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE

Having a database user 'myuser' being a member of the roles 'public'
and 'db_owner', I created the test table:
if exists (select * from dbo.sysobjects where id = object_id(N'[myuser].[TEST]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [myuser].[TEST]
GO
CREATE TABLE [myuser].[TEST] (
[TEST_ID] [varchar] (2) NOT NULL ,
[DESCRIPTION] [varchar] (60) NOT NULL ,
CONSTRAINT [TEST_PK] PRIMARY KEY CLUSTERED
(
[TEST_ID]
) ON [PRIMARY]
) ON [PRIMARY]
GO
However, the primary key constraint 'TEST_PK' does not show up in the view:
select * from INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE
Only foreign keys (from other tables) show up.
Is this a security issue?
Using SQL Server 2000 Dev Edition SP3a on Win XP Prof.
Thank you in advance for your assistance,
SR

Soenke,
This happens when a user tries to get schema information from tables
that they don't own. If you login as myuser it works fine. Can you
create the table as dbo.[TEST]? If you do this then it should work
without issue.
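
For example, something along these lines should then list the TEST_PK column (a minimal sketch reusing the TEST table from your post):

CREATE TABLE dbo.[TEST] (
    [TEST_ID] varchar(2) NOT NULL,
    [DESCRIPTION] varchar(60) NOT NULL,
    CONSTRAINT [TEST_PK] PRIMARY KEY CLUSTERED ([TEST_ID])
)
GO

-- TEST_PK / TEST_ID should now show up
SELECT *
FROM INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE
WHERE TABLE_NAME = 'TEST'
GO
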
Mark Allison, SQL Server MVP
http://www.markallison.co.uk
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602m.html
Soenke Richardsen wrote:
> Having a database user 'myuser' beeing a member of the roles 'public'
> and 'db_owner' I created the test table:
> if exists (select * from dbo.sysobjects where id =
> object_id(N'[myuser].[TEST]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
> drop table [myuser].[TEST]
> GO
> CREATE TABLE [myuser].[TEST] (
> [TEST_ID] [varchar] (2) NOT NULL ,
> [DESCRIPTION] [varchar] (60) NOT NULL ,
> CONSTRAINT [TEST_PK] PRIMARY KEY CLUSTERED
> (
> [TEST_ID]
> ) ON [PRIMARY]
> ) ON [PRIMARY]
> GO
> However, the primary key constraint 'TEST_PK' does not show up in the
> view
> select * from INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE
> only foreign keys (from other tables) show up.
> Is this a security issue?
> Using SQL Server 2000 Dev Edition SP3a on Win XP Prof.
> Thank you in advance for your assistance,
> SR|||Hi Mark,
during the last week I tried several times to reply to your message
using google groups, but always got a message like:
"Unable to retrieve message OQ0Dm2T$EHA.3180@TK2MSFTNGP10.phx.gbl"
Now I found the new beta groups, and they seem to work better...
Anyway, your posting helped me, thanks!
Soenke
Mark Allison wrote:
> Soenke,
> This happens when a user tries to get schema information from tables
> that they don't own. If you login as myuser it works fine. Can you
> create the table as dbo.[TEST]? If you do this then it should work
> without issue.
> --
> Mark Allison, SQL Server MVP
> http://www.markallison.co.uk
> Looking for a SQL Server replication book?
> http://www.nwsu.com/0974973602m.html
>
> Soenke Richardsen wrote:

PIVOT with dynamically created column names

I am trying to do a PIVOT on a query result, but the column names created by the PIVOT function are dynamic.

For example (modified from the SQL Server 2005 Books Online documentation on the PIVOT operator) :

SELECT
Division,
[2] AS CurrentPeriod,
[1] AS PreviousPeriod
FROM
(
SELECT
Period,
Division,
Sales_Amount
FROM
Sales.SalesOrderHeader
WHERE
(
Period = @period
OR Period = @period - 1
)
) p
PIVOT
(
SUM (Sales_Amount)
FOR Period IN ( [2], [1] )
) AS pvt

Let's assume that the value 2 is selected for the @period parameter; the query then returns the sales by division for periods 2 and 1 (2 minus 1).

Division CurrentPeriod PreviousPeriod
A 400 3000
B 400 100
C 470 300
D 800 2500
E 1000 1900

What if the value of @period were changed to, say, period 4, so that it should return the sales for periods 4 and 3? Is there a way I can change the code above to still perform the PIVOT while dynamically accepting the period values 4 and 3, applying them to the column names in the first SELECT statement and to the FOR ... IN clause in the PIVOT statement?

I need a way to represent the following [2] and [1] column names dynamically, depending on the value of the @period parameter.

[2] AS CurrentPeriod,
[1] AS PreviousPeriod

FOR Period IN ( [2], [1] )

I have tried to use @period directly, but it doesn't work.

Thanks in advance.

Kenny

This is one drawback of the current PIVOT feature. You will have to use dynamic SQL for this.
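
For example, something along these lines should work (a minimal sketch based on the query above; it builds the pivoted column names from @period and runs the statement with sp_executesql):

DECLARE @period int;
DECLARE @sql nvarchar(max);

SET @period = 4;

SET @sql = N'
SELECT
    Division,
    ' + QUOTENAME(CAST(@period AS nvarchar(10))) + N' AS CurrentPeriod,
    ' + QUOTENAME(CAST(@period - 1 AS nvarchar(10))) + N' AS PreviousPeriod
FROM
(
    SELECT Period, Division, Sales_Amount
    FROM Sales.SalesOrderHeader
    WHERE Period IN (' + CAST(@period AS nvarchar(10)) + N', '
                       + CAST(@period - 1 AS nvarchar(10)) + N')
) p
PIVOT
(
    SUM (Sales_Amount)
    FOR Period IN (' + QUOTENAME(CAST(@period AS nvarchar(10)))
            + N', ' + QUOTENAME(CAST(@period - 1 AS nvarchar(10))) + N')
) AS pvt;';

EXEC sp_executesql @sql;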

Itzik has written a good article on this.

http://www.sqlmag.com/Article/ArticleID/94268/sql_server_94268.html

Saturday, February 25, 2012

Pivot Tables in Excel

I created pivot tables and pivot charts in Excel getting data from an external source (SQL Server 2000) using an ODBC connection. The problem is when I want to use the same spreadsheet to access the server from outside, through the internet, using an ODBC connection.

The ODBC connection which accesses SQL Server remotely uses the IP address of the server that SQL Server is running on as the server name.

The pivot tables, when they were created, used the ODBC connection accessed from within the LAN (server name /SERVERGROUP/SQLSERVER).

I was just wondering if there is an easy way to modify the code that the pivot tables use to make the connection to SQL Server without recreating all those pivot tables and pivot charts.

When I try to open the file remotely, it fails to make a connection, of course.

Any help is appreciated!

Thanks

If you are using an Excel Macro to do your queries, you are in luck. All you need to do is open the VB Editor and edit the connection string.

Otherwise, you will most likely have to recreate your queries (but if you do so, create a Macro, so you can edit it more easily in the future). All you need to do is turn on the Macro Recorder, build your queries as you did before, even perform any formatting, then stop the recorder. Now, if you ever need to tweak the query or change the data source, it's simply a matter of editing the Macro code.

-b|||Well, thanks. I knew that I could do that, but I have over 20 graphs and 10 reports, so it would be nice if there were a quick and dirty solution. That said, if there is no way to access the code that Excel creates in the background when you use those pivot table wizards, so that I could manually modify the connection string, then I guess I need to redo it from scratch and use macros this time.

Thanks for your input

Pivot Table with SSAS 2005

hi

I'm facing a problem when trying to browse a cube created in SSAS 2005 from a pivot table in FP; this error appears:

The query could not be processed:

o An error was encountered in the transport layer.

o The peer prematurely closed the connection.

Does anyone have a solution for this problem?

thanks

Ensure that you have the Microsoft OLE DB Provider for Analysis Services 9.0 installed, as well as Microsoft Core XML Services 6.0. They can be downloaded here:

http://www.microsoft.com/downloads/details.aspx?FamilyID=df0ba5aa-b4bd-4705-aa0a-b477ba72a9cb&DisplayLang=en

Once they are both installed, run regedit and look under HKEY_CLASSES_ROOT\MSOLAP - there should be a key called CLSID which should have the same value as the CLSID under HKEY_CLASSES_ROOT\MSOLAP.3

Now try to connect to the SSAS cube again... some people have found they need to specify their username as <Domain>\<Username> to connect correctly.