Monday, February 27, 2012

HELP - I NEED EXTRA MATRIX COLUMNS

I need to be able to add an extra column to my matrix - I've searched high and low on the net and I can't seem to find the answer anywhere.

I have grouped data which displays as it should, and I can get the SubTotal part for my Dailies to run - I need to add a total for MTD and YTD (which is part of the SQL data) after the subtotal - is there a straightforward way of doing this?

This is an example of how I want my report to look (I can already achieve the bits highlighted bold)

Daily Data | Commission (GroupData cell 1) | Interest (GroupData CELL 2) | GroupData etc ... | SUBTOTAL (DTD) | MTD TOTAL | YTD TOTAL
Book | Currency | =sum(Fields!DTD.VALUE) | ........ | | =sum(Fields!MTD.VALUE) | =sum(Fields!YTD.VALUE)

and this is an example of the data being returned by my Sproc

Book | CCY | GroupedData | DTD   | MTD   | YTD
ABC  | GBP | Commission  | £0.01 | £0.09 | -£0.10
ABC  | GBP | Interest    | £0.02 | £0.29 | £0.11
ABC  | GBP | Brokerage   | £0.12 | £0.06 | £0.20

When I use the "Add Column" functionality - I get a repeat of the MTD and YTD under each of the groupData cells - where as i only need it as a summary after the subtotal.

Daily Data | Grouped Data CELL 1 | Grouped Data CELL 2
Book Name | Currency | DTD TOTAL | MTD TOTAL | YTD TOTAL | DTD TOTAL | MTD TOTAL | YTD TOTAL | =sum(Fields!DTD.VALUE)

Does anyone out there know how to do it?

This SUCKS.... I can't believe no one knows this and that there are no Microsoft developers monitoring this thread to answer the questions.

|||

Hi there,

Did you try doing subtotals and totals within SQL Server instead of in the matrix? That should give you the extra columns you need.
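(A rough sketch of that idea - this is not from Andrew's post; the column names are taken from the sample data in the question, and "MyDailyData" stands in for whatever the Sproc actually selects from. The MTD/YTD totals come back as extra columns on every row, so the report only has to display them once after the subtotal:)

SELECT d.Book, d.CCY, d.GroupedData, d.DTD,
       t.MTD_Total, t.YTD_Total
FROM   MyDailyData AS d
JOIN  (SELECT Book, CCY,
              SUM(MTD) AS MTD_Total,
              SUM(YTD) AS YTD_Total
       FROM   MyDailyData
       GROUP BY Book, CCY) AS t
  ON   t.Book = d.Book
 AND   t.CCY  = d.CCY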

Alternatively, you could create a second matrix next to the existing one - that may work.

Or see this posting.

http://www.sqlskills.com/blogs/liz/2006/07/21/ReportingServicesGettingTheMatrixToDisplayTwoSubtotalsForTheSameGroup.aspx

cheers,

Andrew

|||

Thanks Andrew -

For anyone reading this thread who isn't sure where the InScope expression goes (like me), it is in the

"Data" bit of the matrix (where you have your =Sum(field!...) bit)

The only thing now is how to sort the positioning out. My design layout looks like the below - I have a DTD breakdown and total (I used the <position> element in the XML behind the report to move the "DTD Total" position to the front). I can get the MTD to be on the same level as the DTD parent cell, as needed; however, the YTD subtotal is one level higher. (I tried messing about with alignment, which sorts the problem temporarily, but on expanding the DTD breakdown the alignment doesn't work.)

How can I make the YTD (Text) appear on the same level as DTD and MTD?

YTD | DTD | MTD | Fields!DTDBreakDown | DTD Total | Field!Book.Value | Field!CCY.Value

iif(InScope("YTD"),
    iif(InScope("MTD"),
        Sum(Fields!AmountCCY_DTD.Value),
        Sum(Fields!AmountCCY_MTD.Value)),
    Sum(Fields!AmountCCY_YTD.Value))

e.g of how my report looks - I need the YTD to be on the same level as DTD and MTD

YTD (+)DTD MTD DTD Total Book 1 CCY 0 0 0

|||This is not possible in the Matrix report. You need to return your columns from the SP or T-SQL that you are using and, instead of using a Matrix, create this report using a Table report.|||

Are you sure? The example given in the URL in Andrew's post (http://www.sqlskills.com/blogs/liz/2006/07/21/ReportingServicesGettingTheMatrixToDisplayTwoSubtotalsForTheSameGroup.aspx) says you can. However, it just doesn't explain how.

I need to use a matrix as the DTD breakdown is dynamic - and I'm almost there; the problem I have is cosmetic.

|||If you want to create the report as given in the above URL then you can do it. Is that what you want to do? The steps are clearly laid out there. If you run into any other problem with it, let me know.

Help - I need a book on SQL

I'm trying to write a complicated SQL statement in Query Analyzer using several tables and
subselects to do some inserts.
I have to do this very often and I only have a limited knowledge of SQL.
Can you recommend a book I can go out and buy that will teach me how to
write involved SQLs for a beginner please.
(The name and author would be great).
Thanks, and frustrated!

If there is one book that I would get, this would be it:
The Guru's Guide to Transact-SQL
by Ken Henderson
http://www.amazon.com/exec/obidos/t...=glance&s=books
and BOL of course
http://sqlservercode.blogspot.com/|||Very good book for writing queries, specifically (as opposed to db design):
"SQL Queries For Mere Mortals" by Hernandez and Viescas
http://www.amazon.com/gp/product/02...glance&n=283155
NOTE: Celko wrote the Foreword to this book - AND the book refers to columns
as fields!!! LOL!!!
"JD" <JD@.discussions.microsoft.com> wrote in message
news:ACA8DD1C-E62E-4AA3-910F-82EEEA6CC2F7@.microsoft.com...
> Im trying to write a complicated SQL in analyser using several tables and
> sub
> selects to do some inserts.
> I have to do this very often and I only have a limited knowledge of SQL.
> Can you recommend a book I can go out and buy that will teach me how to
> write involved SQLs for a beginner please.
> (The name and author would be great).
> Thanks, and frustrated!
>

Help - I can't shut off Remote Connections

Hello all and thanks for any help...
I have just looked into the Event Viewer on our leased Windows Server
running Windows Server 2003 Enterprise Edition and SQL Server 2005 Standard.
I was shocked to see numerous attempts over the last 3 days (hundreds of
them) trying to log into our SQL Server. Unfortunately only the Failed
attempts were logged.
I was under the impression that by default, SQL Server 2005 is installed to
NOT allow remote connections, which I interpret as connections over the
public internet.
When I looked into Properties/Connections for the server instance I was
again shocked to see that "Allow remote connections" was checked. So I tried
logging into our SQL Server via the internet and got in immediately.
I then unchecked the option to allow remote connections and restarted the
server.
Apparently I don't understand what that option means, because I can still
log in via the internet using the public IP address of the Web Server.
I can only shut the SQL Server down for a short period of time.
Can somebody tell me what option I must configure to stop the ability to log
in via the public internet?
Thanks to all...John
By default SQL Server listens on port 1433. You can change it to prevent
(along with the other things you have already mentioned) logins from the
internet.
"John Kotuby" <JohnKotuby@.discussions.microsoft.com> wrote in message
news:uvTki550HHA.4712@.TK2MSFTNGP04.phx.gbl...
> Hello all and thanks for any help...
> I have just looked into the Event Viewer on our leased Windows Server
> running Windows Server 2003 Enterprise Edition and SQL Server 2005
> Standard. I was shocked to see numerous attempts over the last 3 days
> (hundreds of them) trying to log into our SQL Server. Unfortunately only
> the Failed attempts were logged.
> I was under the impression that be default, SQL Server 2005 is installed
> to NOT allow remote connections, which I interpret as connections over the
> public internet.
> When I looked into Properties/Connections for the server instance I was
> again shocked to see that "Allow remote connections" was checked. So I
> tried logging into our SQL Server via the internet and got in immediately.
> I then unchecked the option to allow remote connections and restarted the
> server.
> Apparently I don't understand what that option means, because I can still
> log in via the internet using the public IP address of the Web Server.
> I can only shut the SQL Server down for a short period of time.
> Can somebody tell me what option I must configure to stop the ability to
> log in via the public internet?
> Thanks to all...
>

Help - How to validate parameters?

I need to validate a parameter's length to make sure it is 8 characters long
and whether it is of numeric value, and if it is not, to display a validation
error message. Is this possible? I'm using Reporting Services and SQL Server
2000.
Any help will be greatly appreciated.

Have you tried placing custom code to validate these, then placing a dummy
textbox at the top of the report to call the validate code?
Eg. =Code.ValidateParameters()
Get this to show a box or something if it fails.
"Ken" <Ken@.discussions.microsoft.com> wrote in message
news:D2458D65-E2C3-4C80-951E-D44B389929AA@.microsoft.com...
>I need to validate a parameters Length to make sure it is 8 characters long
> and whether it is of numeric value and if it is not to display a
> validation
> error message. Is this possible. I'm using reporting service and SQL
> Server
> 2000.
> Any help will greatly appreciated.|||Hi, Craig,
my code looks like this:
declare @lastmonth char(7)
@lastmonth = code.mycode() <-- will return '2006.02'
select comments from theTable
where yearmonth = @lastmonth
I execute it and get an error "syntax error near @lastmonth"
the program of mycode() is in layout, report, report properties, code.
Thanks!
Henry
"Ken" wrote:
> I need to validate a parameters Length to make sure it is 8 characters long
> and whether it is of numeric value and if it is not to display a validation
> error message. Is this possible. I'm using reporting service and SQL Server
> 2000.
> Any help will greatly appreciated.|||Well for starters you can't put multiple lines in the dataset. If you were
doing this you can only do:
="select comments from theTable where yearmonth = "& Code.mycode()
Which will return the SQL statement with the date included.
I thought the original question though was how to validate the parameters?
I was thinking of putting a Textbox on the form which has a value of
=Code.ValidateParameter(Parameters!DateParam.Value)
as an example. Then you can make this visible if the text box has anything
in it.
You can use the same syntax for the SQL statement if you wanted to check the
parameter before you passed it in to the database within the dataset.
Craig
"Henry Chen" <HenryChen@.discussions.microsoft.com> wrote in message
news:7F512BB0-8D5C-4E16-B1E2-6A0270BB0117@.microsoft.com...
> Hi, Craig,
> my code looks like this:
> declare @.lastmonth char(7)
> @.lastmonth = code.mycode() <-- will return '2006.02'
> select comments from theTable
> where yearmonth = @.lastmonth
> I execute it and get an error "syntax error near @.lastmonth"
> the program of mycode() is in layout, report, report properties, code.
> Thanks!
> Henry
>
> "Ken" wrote:
>> I need to validate a parameters Length to make sure it is 8 characters
>> long
>> and whether it is of numeric value and if it is not to display a
>> validation
>> error message. Is this possible. I'm using reporting service and SQL
>> Server
>> 2000.
>> Any help will greatly appreciated.

HELP - How to specify no credentials programatically?

When we use the ReportingService2005.CreateReport () method to publish an RDL
file to the server (from a C# client application we've developed), there
doesn't seem to be a way to specify within the RDL that "No credentials are
required" for the datasource it uses. Nor does the CreateReport() method
accept a parameter to specify this. The system seems to default to
"Credentials Supplied by User" which is not what we need.
The datasource definition is internal to the RDL file (i.e. not a shared
one).
Does anyone know how to do this?
thanks in advance,
Michael

You will need to call GetItemDataSources, modify the data source accordingly
and then call SetItemDataSources.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"MichaelM" <MichaelM@.discussions.microsoft.com> wrote in message
news:50C78B6E-EBC5-4D16-A207-21E75FC42A12@.microsoft.com...
> When we use the ReportingService2005.CreateReport () method to publish an
> RDL
> file to the server (from a C# client application we've developed), there
> doesn't seem to be a way to specify within the RDL that "No credentials
> are
> required" for the datasource it uses. Nor does the CreateReport() method
> accept a parameter to specify this. The system seems to default to
> "Credentials Supplied by User" which is not what we need.
> The datasource definition is internal to the RDL file (i.e. not a shared
> one).
> Does anyone know how to do this?
> thanks in advance,
> Michael

Help - how to select from a comma delimited field ?

Dear SQL,

Let's say I have a field: User_Names

and it can have more than one name
example: "Yovav,John,Shon"

How can I select all the records that have "Yovav" in them ?

I try some like this:
SELECT * FROM User_Table WHERE User_Names IN ('Yovav')

but it only works if the field User_Names only has "Yovav" in it with no extras...

is there some SQL function to tell if a string is found in a field ?

Any hope 4 me ?

Try

SELECT * FROM User_Table WHERE User_Names LIKE '%Yovav%'

|||Thank Q 4 lightning up my night in such times of overSQLing...

finally I can have some sleep :-)

and Thank GOD 4 this forum !!!
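(A side note for later readers, not part of the original replies: LIKE '%Yovav%' will also match rows where a longer name merely contains "Yovav". A minimal sketch that matches whole names only, assuming the list is comma-separated with no spaces:)

SELECT *
FROM   User_Table
WHERE  ',' + User_Names + ',' LIKE '%,Yovav,%'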

Help - how to open Report in new browser window?

Hello...
I am struggling with this...but I am getting close.
I am trying to use the Web Service programming interface from my ASP.Net
application. What I am trying to accomplish...is to allow the user to set
some report parameters (date ranges, zip codes, etc). When the user presses
the Run button on my ASP.Net page, I would like to render the report in a new
browser window/page, and with the Toolbar so the user could export to
whatever format that they want. Is this possible?
Thanks for any help...
- DW
The code I am using now is the following..
Dim rptIRIS As Byte() = Nothing
Dim rsIRIS As Secure_SQLRS.ReportingService = New
Secure_SQLRS.ReportingService
rsIRIS.Credentials = New System.Net.NetworkCredential("ME", "Pwd",
"Domain")
rsIRIS.PreAuthenticate = True
Dim rptPath As String = "/WebReports/UserLabels"
Dim rptFormat As String = "HTML4.0"
Dim rptViewerInfo As String = "<DeviceInfo>" & _
"<Toolbar>true</Toolbar>" & _
"<Parameters>false</Parameters>" & _
"<DocMap>true</DocMap>" & _
"<Zoom>100</Zoom>" & _
"</DeviceInfo>"
Dim rptParams(1) As Secure_SQLRS.ParameterValue
Dim paramValue As Secure_SQLRS.ParameterValue = New
Secure_SQLRS.ParameterValue
paramValue.Name = "ProductID"
paramValue.Value = "102"
rptParams(0) = paramValue
paramValue = New Secure_SQLRS.ParameterValue
paramValue.Name = "ZipCode"
paramValue.Value = "12345"
rptParams(1) = paramValue
Dim rptHistoryID As String = Nothing
Dim credentials() As Secure_SQLRS.DataSourceCredentials = Nothing
Dim showHideToggle As String = Nothing
Dim encoding As String
Dim mimeType As String
Dim warnings() As Secure_SQLRS.Warning = Nothing
Dim rptHistoryParams() As Secure_SQLRS.ParameterValue = Nothing
Dim streamIDs() As String = Nothing
Dim rptSessionHdr As Secure_SQLRS.SessionHeader = New
Secure_SQLRS.SessionHeader
rsIRIS.SessionHeaderValue = rptSessionHdr
Try
' execute the report
rptIRIS = rsIRIS.Render(rptPath, rptFormat, rptHistoryID,
rptViewerInfo, rptParams, credentials, _
showHideToggle, encoding, mimeType, rptHistoryParams,
warnings, streamIDs)
rptSessionHdr.SessionId = rsIRIS.SessionHeaderValue.SessionId
Response.Clear()
' set the http headers for the PDF response
HttpContext.Current.Response.ClearHeaders()
HttpContext.Current.Response.ClearContent()
HttpContext.Current.Response.ContentType = "text/html"
HttpContext.Current.Response.AppendHeader("Content-Disposition", _
"filename=""AddressLabels.HTM""")
' send the byte array containing the report as a binary response
HttpContext.Current.Response.BinaryWrite(rptIRIS)
HttpContext.Current.Response.End()
Catch ex As Exception
Finally
End Try

You won't be able to accomplish this with straight SOAP interfaces. You will
have to use a combination of Web service calls to list the reports in the
namespace and then a URL request to the server that includes the
parameterized URL for the report. You can certainly open it in a new browser
window under those conditions.
--
Bryan Keller
Developer Documentation
SQL Server Reporting Services
A friendly reminder that this posting is provided "AS IS" with no
warranties, and confers no rights.
"dw" <dw@.discussions.microsoft.com> wrote in message
news:08CC0691-B2A2-4F4C-A1DA-7DAB23E67F20@.microsoft.com...
> Hello...
> I am struggling with this...but I am getting close.
> I am trying to use the Web Service programming interface from my ASP.Net
> application. What I am trying to accomplish...is to allow the user to set
> some report parameters (date ranges, zip codes, etc). When the user
presses
> the Run button on my ASP.Net page, I would like to render the report in a
new
> browser window/page, and with the Toolbar so the user could export to
> whatever format that they want. Is this possible?
> Thanks for any help...
> - DW
> The code I am using now is the following..
> Dim rptIRIS As Byte() = Nothing
> Dim rsIRIS As Secure_SQLRS.ReportingService = New
> Secure_SQLRS.ReportingService
> rsIRIS.Credentials = New System.Net.NetworkCredential("ME", "Pwd",
> "Domain")
> rsIRIS.PreAuthenticate = True
> Dim rptPath As String = "/WebReports/UserLabels"
> Dim rptFormat As String = "HTML4.0"
> Dim rptViewerInfo As String = "<DeviceInfo>" & _
> "<Toolbar>true</Toolbar>" & _
> "<Parameters>false</Parameters>" & _
> "<DocMap>true</DocMap>" & _
> "<Zoom>100</Zoom>" & _
> "</DeviceInfo>"
>
> Dim rptParams(1) As Secure_SQLRS.ParameterValue
> Dim paramValue As Secure_SQLRS.ParameterValue = New
> Secure_SQLRS.ParameterValue
> paramValue.Name = "ProductID"
> paramValue.Value = "102"
> rptParams(0) = paramValue
> paramValue = New Secure_SQLRS.ParameterValue
> paramValue.Name = "ZipCode"
> paramValue.Value = "12345"
> rptParams(1) = paramValue
> Dim rptHistoryID As String = Nothing
> Dim credentials() As Secure_SQLRS.DataSourceCredentials = Nothing
> Dim showHideToggle As String = Nothing
> Dim encoding As String
> Dim mimeType As String
> Dim warnings() As Secure_SQLRS.Warning = Nothing
> Dim rptHistoryParams() As Secure_SQLRS.ParameterValue = Nothing
> Dim streamIDs() As String = Nothing
> Dim rptSessionHdr As Secure_SQLRS.SessionHeader = New
> Secure_SQLRS.SessionHeader
> rsIRIS.SessionHeaderValue = rptSessionHdr
> Try
> ' execute the report
> rptIRIS = rsIRIS.Render(rptPath, rptFormat, rptHistoryID,
> rptViewerInfo, rptParams, credentials, _
> showHideToggle, encoding, mimeType, rptHistoryParams,
> warnings, streamIDs)
> rptSessionHdr.SessionId = rsIRIS.SessionHeaderValue.SessionId
> Response.Clear()
> ' set the http headers for the PDF response
> HttpContext.Current.Response.ClearHeaders()
> HttpContext.Current.Response.ClearContent()
> HttpContext.Current.Response.ContentType = "text/html"
> HttpContext.Current.Response.AppendHeader("Content-Disposition", _
> "filename=""AddressLabels.HTM""")
> ' send the byte array containing the report as a binary response
> HttpContext.Current.Response.BinaryWrite(rptIRIS)
> HttpContext.Current.Response.End()
> Catch ex As Exception
> Finally
> End Try

Help - how to execute an sp using linked server

When I run it from the main server, which has the linked server connection, the sp executes but the result set does not get displayed. When I call the same proc from SSRS I get 'an error occurred when retrieving parameters for the query. sp does not exist'.

Thanks

You will have a tab called 'Command Type'; there, change it to Stored Procedure. Then try to execute again - also make sure you have declared all the parameters before executing it.

Hope it solves your problem.
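(For illustration only - a minimal way to test the call outside of SSRS with the parameters declared up front. The linked server, database, procedure, and parameter names here are placeholders, and it assumes RPC Out is enabled on the linked server:)

DECLARE @StartDate datetime, @EndDate datetime
SET @StartDate = '20120101'
SET @EndDate   = '20120131'

-- four-part name: linked server, remote database, schema, procedure
EXEC [MyLinkedServer].[RemoteDb].[dbo].[usp_MyReportProc] @StartDate, @EndDate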

Help - how to concatinate strings from multiple rows?

I have a need to concatenate all Descriptions from a select statement

SELECT
t_ReviewSection.PeerRevSectionDescription
FROM
t_ReviewSection
WHERE
t_ReviewSection.PeerRevID = @lngRevID
ORDER BY
t_ReviewSection.PeerRevSectionOrder

I want to return a single string "section1, section2, section3, section4"
based on the multiple rows returned.

Any ideas?

|||

There is unfortunately no safe way to do this with a single SELECT
statement. The only safe way is to iterate over the data in a cursor
and concatenate to a variable.

It may be better to just get the data from SQL Server and then concatenate
in the client.
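(For anyone who wants to see the cursor route Erland describes, here is a minimal sketch using the table and column names from the question - illustrative only:)

DECLARE @lngRevID int
SET @lngRevID = 1          -- whichever review you are after

DECLARE @list varchar(8000), @desc varchar(1000)   -- sizes are guesses; adjust to your columns
SET @list = ''

DECLARE desc_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT PeerRevSectionDescription
    FROM   t_ReviewSection
    WHERE  PeerRevID = @lngRevID
    ORDER BY PeerRevSectionOrder

OPEN desc_cur
FETCH NEXT FROM desc_cur INTO @desc
WHILE @@FETCH_STATUS = 0
BEGIN
    -- add the separator only between items
    IF @list <> '' SET @list = @list + ', '
    SET @list = @list + @desc
    FETCH NEXT FROM desc_cur INTO @desc
END

CLOSE desc_cur
DEALLOCATE desc_cur

SELECT @list AS Descriptions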

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Thanks - I did it in code as you suggested. I always have that to fall back
on but you know how it is. You try to do everything in the Sproc if you can
and I'm not nearly as talented in Sprocs as I am in VB. Figured maybe I was
missing something.

Thanks

"Erland Sommarskog" <sommar@.algonet.se> wrote in message
news:Xns94EE134E26DDYazorman@.127.0.0.1...
> Jerry (jerryg_no_spam@.ptd.net) writes:
> > I have a need to concatenate all Descriptions from a select statement
> > SELECT
> > t_ReviewSection.PeerRevSectionDescription
> > FROM
> > t_ReviewSection
> > WHERE
> > t_ReviewSection.PeerRevID = @.lngRevID
> > ORDER BY
> > t_ReviewSection.PeerRevSectionOrder
> > I want to return a single string "section1, section2, section3,
section4"
> > based on the multiple rows returned.
> There is unfortunately no safe way to do this with a single SELECT
> statement. The only safe way is to iterate over the data in a cursor
> and concatenate to a variable.
> It may be better to just get the data from SQL Server and then concatenate
> in the client.
>
> --
> Erland Sommarskog, SQL Server MVP, sommar@.algonet.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp|||Jerry (jerryg_no_spam@.ptd.net) writes:
> Thanks - I did it in code as you suggested. I always have that to fall
> back on but you know how it is. You try to do everyting in the Sproc
> if you can and I'm not nearly as talented in Sprocs as I am in VB.
> Figured maybe I was missing something.

What to do in application code and what to do in SQL may not always
be obvious. But as a general rule of thumb, SQL is good for raw data
retrieval, and also business logic and also computations to some degree.
However, string handling and formatting is poor in SQL.

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Concatenation of data in different rows can be done without using a cursor..
try this out...
DECLARE @desc VARCHAR(1000)
SELECT @desc = @desc + ', ' +
t_ReviewSection.PeerRevSectionDescription
FROM
t_ReviewSection
WHERE
t_ReviewSection.PeerRevID = @lngRevID
ORDER BY
t_ReviewSection.PeerRevSectionOrder
SET @desc = substring(@desc, 3, len(@desc))

|||

But it is not reliable. The result of the above operation is undefined,
so you may get what you expect, or you may get something else.

See http://support.microsoft.com/default.aspx?scid=287515.

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Help -- How do I exit a Foreach Loop

I have a foreach loop that is processing files in a directory. One of the tasks is the execute sql task that calls a stored proc. If this stored proc returns a certain value then I need to exit the loop and not process any more files. But I have been unable to figure out how to make this happen.

Any help would be greatly appreciated.

Thanks!

GN

Hi,

I am not sure whether this will work or not, but just try it. In the 'Foreach Loop' container, select the 'Expressions' tab and assign a boolean variable to the property 'Disable'. Then, based on the result, when required, set the value of the variable to 'True'. If that does not work, try the same approach with 'ForceExecutionResult' also.

Let me know if any one of it works.

|||


have you considered using the script task instead?|||

Place the workflow in a sequence inside the ForEach Loop.

Place a script task at the beginning of the loop with a precedence constraint going to the sequence.

Place an expression on the precedence constraint so that the sequence only executes if a variable is true; call the variable "ExecuteBody": ExecuteBody==true

Set the evaluation operation to 'Expression' only.

When the loop should terminate, set the ExecuteBody variable to false. The loop will traverse the entire remaining collection, but because there is nothing to execute, it will rip through it extremely fast and terminate.

K

Help - How do I create a new instance

Hi Guys

I'm new to SQL and I need to create a new instance.

Help

Do you already have an instance? Which SQL Server version are you using - SQL Server Express or a full-featured version?

HTH, Jens K. Suessmeyer.

http://www.sqlserver2005.de|||

hi,

you need to install (and not create) a new instance if no other instance has been already installed..

If you install a SQLExpress instance, by default a named instance named SQLExpress will be installed... its actual full name will be <ComputerName>\SQLExpress

to do that, just run the SQLExpress installer.. if you like to "play" with the advanced options, you can modify the instance name (or install a default instance as well, if not already installed), choose whether remote connections will be enabled (network protocols are disabled by default), modify (if required) the collation settings for sort order and comparisons, allow standard SQL Server logins instead of Windows-authenticated connections only, and the like...

regards

Help - Formulas with Functions across Databases

I'm trying to build a table that uses a formula with a function in another database on the same server group. Enterprise Manager validates the formula, then when I save the table, an ODBC error pops up stating that the function is an invalid object name. I'm calling the function in the formula as follows: ([OtherDatabase].[dbo].[FunctionName](parameters)). Any ideas?

-- JakeAre you running at least sp2 ?

-PatP|||Yes, I'm running SP2.

-- Jake|||I know that certain situations do force you to use cross-database DML. But if you're creating a calculated field (is that what you're trying to do?) with a function in another database, this means that both databases, AND THEIR NAMES (!!!), must go together, right? Why don't you move the function to the same database as the table? If the function itself references objects in another database, that would be transparent to the table. If you want "reusability" and you happen to have a table (or tables) in another database with a calculated field that utilizes the same function, then you can just reference that function from within a local function wrapper.

|||The Database I'm working on is a template that will be replicated for numerous individual projects, so I'm trying to reduce the number of locations the generalized Function code will be stored in. Otherwise, I was curious why I can put ([OtherDatabase].[dbo].[FunctionName](parameters)) in a View (for example) and get the desired result, yet have it not recognized as an object when said code is placed in the Formula field (even after it supposedly validates the code). If I'm missing something syntax-wise, please let me know.
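(To make the "function wrapper" idea above concrete - a minimal sketch with made-up object names. The computed column then references only a function in its own database, and that local function calls the one in the other database:)

-- local wrapper, created in the same database as the table
CREATE FUNCTION dbo.fn_WrapOtherDbCalc (@SomeValue int)
RETURNS int
AS
BEGIN
    RETURN OtherDatabase.dbo.FunctionName(@SomeValue)
END
GO

-- the formula (computed column) now only references the local wrapper
ALTER TABLE dbo.MyTable ADD CalcColumn AS (dbo.fn_WrapOtherDbCalc(SomeColumn))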

Help - Finding the newest date without using max()

Hi,

Can anyone help please?

select notefield, modifiedon
FROM Table1
WHERE id = '100426' and
(statusfield like '%criteria1%' OR
statusfield like '%criteria2%')

Produces a list of records based upon the criteria. I would like to be
able to only show the newest dated record, from the modifiedon field.

I've tried max(modifiedon), but as I am using an aggregate function in
the query I have to use GROUP BY, which notefield does not like as it
is an ntext field.

(I get:

Server: Msg 306, Level 16, State 2, Line 1
The text, ntext, and image data types cannot be compared or sorted,
except when using IS NULL or LIKE operator.
)

Any ideas please?

select TOP 1 notefield, modifiedon
FROM Table1
WHERE id = '100426' and
(statusfield like '%criteria1%' OR
statusfield like '%criteria2%')
ORDER BY modifiedon DESC

--------
Alexander Kuznetsov
http://sqlserver-tips.blogspot.com/
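(An equivalent way to write it with MAX(), in case you would rather not rely on TOP - still no GROUP BY on the ntext column; the names are the ones from the question:)

SELECT notefield, modifiedon
FROM   Table1
WHERE  id = '100426'
  AND  (statusfield LIKE '%criteria1%' OR statusfield LIKE '%criteria2%')
  AND  modifiedon = (SELECT MAX(modifiedon)
                     FROM   Table1
                     WHERE  id = '100426'
                       AND  (statusfield LIKE '%criteria1%' OR
                             statusfield LIKE '%criteria2%'))
-- note: unlike TOP 1, this returns several rows if more than one record shares the newest modifiedon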

HELP-- execution dts in transaction -- URGENT

please I need to know what is required to properly execute a dts with steps joined to transactions in it.

i know MSDTC must be running on server but it must be running in local machine too?

user running dts must be sysadmin? is it required ?

any other requirements needed?

please write down a full list of required items or a BOL reference to look for

Maybe if you tell us what you're trying to do?|||What I want to know is what the requirements are to execute a DTS package with steps joined to transactions in it. I am talking about requirements related to services, user privileges and so on.

I want to know this as it applies to the general problem (that is, how to execute a transactional DTS).

Anyway, I will tell you my specific problem. I have a DTS package that loops over files in a directory, processing each file (that is, inserting rows into several tables of a SQL Server database) in such a way that either all files are processed or, if one is not processed due to an error (logged to a log table), all previously processed files must be de-processed (the changes made to the database must be rolled back). Thus transactions are needed in the DTS for some steps (those that change the database) but not for others (those that perform tests and just write to the log table).

This is easy enough to set up in a DTS package, but I have experienced several problems due to the transactions. First, with the MSDTC service running on the server machine (where SQL Server is running), I cannot execute the DTS successfully unless MSDTC is also running on my local machine. Also, it is not possible for my user to execute the DTS from the DTS designer unless my user is sysadmin on the SQL Server machine. So I want to know exactly what is necessary to execute this kind of DTS (a transactional DTS) in terms of services, security, user profiles and roles, etc.

Help - Error: Unable to open the physical file

Hi, I've just finished creating my first ASP.NET website. I created it on my laptop and it works perfectly, but I'm having some problems deploying it to my web server (PC).
The site works ok, but when I try to log in to my client area, I get this error:

"Unable to open the physical file "c:\inetpub\wwwroot\ONeillStrata\App_Data\aspnetdb.mdf". Operating system error 5: "5(Access is denied.)".
An attempt to attach an auto-named database for file c:\inetpub\wwwroot\ONeillStrata\App_Data\aspnetdb.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share."

I've searched far and wide for a solution and have read many articles, but none seem to be able to fix my problem. I've tried using that SEUtil tool, but that didn't work, I've made sure the App_Data directory on the web server isn't read only (the read-only checkbox isn't ticked, but it DOES have that little green square which I can't get rid of), I've assigned the ASPNET user as a db owner and that didn't work and I've manually attached the database to the web server's instance of SQL Express 2005.
When I launch the solution in VS2005 on the web server, I can browse through the database and see the tables and data etc, but when I try to run the application, I get that message above...

Surely it shouldn't be this hard to get it going?
Any help would be massively appreciated.

Hi,

It seems that the physical file location of the aspnetdb.mdf is wrong.

Please open your IIS, right click on your default web site, and click on Properties; change to the ASP.NET tab and edit the configuration. Then you can see the LocalSqlServer string listed there; change the string to the following: data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|aspnetdb.mdf;User Instance=true.

Thanks.

Help - Error when modifying dataset

I keep getting this error when I go to Preview the report. I have changed
the code for the report. Then I pasted my new code in and adjusted all the
fields in the layout to respond to the new fields. But I can't get it to
Preview - I keep getting this error message.
The SortExpression expression for the table 'table1' refers to the field
'project'. Report item expressions can only refer to fields within the
current data set scope or, if inside an aggregate, the specified data set
scope.
Can anyone tell me how to correct this information to use my new code?
Thank you
Normad"NormaD" wrote:
> I keep getting this error when I go to Preview the report. I have changed
> the code for the report. then I pasted my new code in and adjusted all the
> fields in the layout to respond to the new fields. But I can't get it to
> Preview - I keep getting this error message.
> The SortExpresssion expression for the table â'table1â' refers to the field
> â'projectâ'. Report item expressions can only refer to fields within the
> current data set scope or, if inside an aggregate, the specified data set
> scope
> Can anyone tell me how to correct this information to use my new code?
> Thank you
> Normad
Thanks to any of you who were reading this to respond. I went into the XML
and did the changes.
normad

HELP - Error Message when running DBCC Showcontig with TableResults

Has anyone encountered the following error message when
running DBCC Showcontig with tableresults?
[Microsoft][ODBC SQL Server Driver]Unknown token received
from SQL Server
I am doing a dynamic DBCC on every table in my DB to
perform some analysis of indexes, however when I run the
full DB, I get this error. I printed the table.index for
each index prior to running the dbcc and when I take the
table and index and run independently in ISQL, it executes
fine, but gives me a message about:
'DBCC execution completed. If DBCC printed error messages,
contact your system administrator.'
Which is not viable since it is a text string and not the
parameters of the TABLERESULTS table.
Anyone have a workaround or know what I can do to fix this?

I'd start checking the database using CHECKDB. If that comes out clean, I'd say you've hit a bug
(check KB, be current and if that doesn't help open case with MS PSS).
--
Tibor Karaszi, SQL Server MVP
Archive at: http://groups.google.com/groups?oi=djq&as ugroup=microsoft.public.sqlserver
"Chris" <chris@.da-krewe.com> wrote in message news:066a01c391af$9183bca0$a101280a@.phx.gbl...
> Has anyone encountered the following error message when
> running DBCC Showcontig with tableresults?
> [Microsoft][ODBC SQL Server Driver]Unknown token received
> from SQL Server
> I am doing a dynamic DBCC on every table in my DB to
> perform some analysis of indexes, however when I run the
> full DB, I get this error. I printed the table.index for
> each index prior to running the dbcc and when I take the
> table and index and run independantly in ISQL, it executes
> fine, but gives me a message about:
> 'DBCC execution completed. If DBCC printed error messages,
> contact your system administrator.'
> Which is not viable since it is a text string and not the
> parameters of the TABLERESULTS table.
> Anyone have a workaround or know what I can do to fix this?

Help - Error deleting publication or adding subscription

In the haste of trying to delete a publication/subscription combo, it looks
as if it's stuck in limbo. I must have done something out of sequence.
In Microsoft SQL Server Management Studio it appears the subscription on the
secondary server has been deleted yet I can't delete the publication as it
says it can't delete the subscription. When I try to add the subscription on
the second server it says it already exists, even though It doesn't list
itself under subscriptions.
I'm trying to remove all replication and I'm wondering if it needs to be
deleted manually through the database as something is out of sync.
Thanks
Ron
If this is a push subscription, issue the following:

sp_dropsubscription @publication = 'PublicationName', @article = 'ALL',
    @subscriber = 'SubscriberName', @destination_db = 'SubscriptionDatabase',
    @ignore_distributor = 0

If this does not clear it up, do this:

sp_dropsubscription @publication = 'PublicationName', @article = 'ALL',
    @subscriber = 'SubscriberName', @destination_db = 'SubscriptionDatabase',
    @ignore_distributor = 1

Then check your subscription database. See if there are entries in the tables
MSreplication_subscriptions and MSsubscription_agents for this publication
and delete the entries if they exist.

--
http://www.zetainteractive.com - Shift Happens!
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
"Ron" <Ron@.discussions.microsoft.com> wrote in message
news:AB12955C-2B1C-4C1A-9D11-7F47D8FB6A29@.microsoft.com...
> In the haste of trying to delete a publication/subscription combo, it
> looks
> as if it's stuck in limbo. I must have done something out of sequence.
> In Microsoft SQL Server Management Studio it appears the subscription on
> the
> secondary server has been deleted yet I can't delete the publication as it
> says it can't delete the subscription. When I try to add the subscription
> on
> the second server it says it already exists, even though It doesn't list
> itself under subscriptions.
> I'm trying to remove all replication and I'm wondering if it needs to be
> deleted manually through the database as something is out of sync.
> Thanks
> Ron
|||Thanks Hilary,
Before I saw your reply I executed the following and it worked:
EXEC sp_droppublication @publication = N'xxxxxxx'
go
USE master
GO
EXEC sp_replicationdboption @dbname = N'xxxxxxx',
@optname = N'publish', @value = N'false'
"Hilary Cotter" wrote:

> If this is a push subscription issue the following
> sp_dropsubscription @.publication= 'PublicationName', @.article= 'ALL',
> @.subscriber= 'SubscriberName'
> , @.destination_db= 'SubscriptionDatabase', @.ignore_distributor = 0If this
> does not clear it up do thissp_dropsubscription @.publication=
> 'PublicationName', @.article= 'ALL', @.subscriber= 'SubscriberName'
> , @.destination_db= 'SubscriptionDatabase', @.ignore_distributor = 1Then
> check your subscription database. See if there are entries in the tables
> MSreplication_subscriptions and MSsubscription_agents for this publication
> and delete the entries if they exist.-- http://www.zetainteractive.com -
> Shift Happens!
> Looking for a SQL Server replication book?
> http://www.nwsu.com/0974973602.html
> Looking for a FAQ on Indexing Services/SQL FTS
> http://www.indexserverfaq.com
> "Ron" <Ron@.discussions.microsoft.com> wrote in message
> news:AB12955C-2B1C-4C1A-9D11-7F47D8FB6A29@.microsoft.com...
>
>

Help - DTS package crashing with broken connection?

All,
Windows XP SP2 + hotfixes
SQL Server 2000 SP3
Has anybody seen this one? And is there a fix/workaround for it?
I have a DTS package that loads data in from CSV formatted flat files. The
package executes Ok for two small tables however it crashes about 10,000
records into a transformation task which is loading a flat file of 230,000+
records. Here's the error:
Step 'DTSStep_DTSDataPumpTask_3' failed
Step Error Source: Microsoft Data Transformation Services (DTS) Data Pump
Step Error Description:The number of failing rows exceeds the maximum
specified. (Microsoft OLE DB Provider for SQL Server (80004005): The
statement has been terminated.) (Microsoft OLE DB Provider for SQL Server
(80004005): Cannot insert duplicate key row in object 'Staff' with unique
index 'IX_StaffUserName'.)
Step Error code: 8004206A
Step Error Help File:sqldts80.hlp
Step Error Help Context ID:0
Step Execution Started: 9/9/2005 3:41:52 PM
Step Execution Completed: 9/9/2005 3:42:07 PM
Total Step Execution Time: 15.391 seconds
Progress count in Step: 42000
The task runs a VBScript and has 1 Lookup - a check to see if a row exists
before inserting. The Lookup is using the same connection as the Load {which
I read is not good} but it must do so because the existence check is against
the table being loaded and loaded rows are not visible to other
connections...
Any ideas how I can a) circumvent this issue or b) allow load processing to
'quietly' fail on duplicate inserts but run to completion anyway?

Looks like you got a duplicate key problem when inserting data into the Staff
table - a violation of the unique index IX_StaffUserName. Clean out your data
before doing the import again, or import the data into a staging table with
no restrictions and then clean it up before importing into the main table.
Q
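(A rough sketch of the staging-table approach described above - the staging table and the StaffUserName column are assumptions based on the index name in the error, so adjust to the real schema:)

-- load the flat file into StaffStaging with no constraints, then:
INSERT INTO Staff (StaffUserName /*, other columns... */)
SELECT s.StaffUserName /*, other columns... */
FROM   StaffStaging AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   Staff AS t
                   WHERE  t.StaffUserName = s.StaffUserName)
-- if the file itself contains duplicates, de-duplicate the staging rows first (DISTINCT or GROUP BY)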
"Richard" wrote:

> All,
> Windows XP SP2 + hotfixes
> SQL Server 2000 SP3
> Has anybody seen this one? And is there a fix/workaround for it?
> I have a DTS package that loads data in from CSV formatted flat files. Th
e
> package executes Ok for two small tables however it crashes about 10,000
> records into a transformation task which is loading a flat file of 230,000
+
> records. Here's the error:
> Step 'DTSStep_DTSDataPumpTask_3' failed
> Step Error Source: Microsoft Data Transformation Services (DTS) Data Pump
> Step Error Description:The number of failing rows exceeds the maximum
> specified. (Microsoft OLE DB Provider for SQL Server (80004005): The
> statement has been terminated.) (Microsoft OLE DB Provider for SQL Server
> (80004005): Cannot insert duplicate key row in object 'Staff' with unique
> index 'IX_StaffUserName'.)
> Step Error code: 8004206A
> Step Error Help File:sqldts80.hlp
> Step Error Help Context ID:0
> Step Execution Started: 9/9/2005 3:41:52 PM
> Step Execution Completed: 9/9/2005 3:42:07 PM
> Total Step Execution Time: 15.391 seconds
> Progress count in Step: 42000
> The task runs a VBScript and has 1 Lookup - a check to see if a row exists
> before inserting. The Lookup is using the same connection as the Load {wh
ich
> I read is not good} but it must do so because the existence check is again
st
> the table being loaded and loaded rows are not visible to other
> connections...
> Any ideas how I can a) circumvent this issue or b) allow load processing t
o
> 'quietly' fail on duplicate inserts but run to completion anyway?
>
>|||I think this response from Louis was meanbt for you.
Check this KB article for the explanation:
http://support.microsoft.com/defaul...kb;en-us;329329
"Richard" <Richard@.discussions.microsoft.com> wrote in message
news:5A302543-FB5F-484B-9E7D-06943BEACC04@.microsoft.com...
> All,
> Windows XP SP2 + hotfixes
> SQL Server 2000 SP3
> Has anybody seen this one? And is there a fix/workaround for it?
> I have a DTS package that loads data in from CSV formatted flat files.
The
> package executes Ok for two small tables however it crashes about 10,000
> records into a transformation task which is loading a flat file of
230,000+
> records. Here's the error:
> Step 'DTSStep_DTSDataPumpTask_3' failed
> Step Error Source: Microsoft Data Transformation Services (DTS) Data Pump
> Step Error Description:The number of failing rows exceeds the maximum
> specified. (Microsoft OLE DB Provider for SQL Server (80004005): The
> statement has been terminated.) (Microsoft OLE DB Provider for SQL Server
> (80004005): Cannot insert duplicate key row in object 'Staff' with unique
> index 'IX_StaffUserName'.)
> Step Error code: 8004206A
> Step Error Help File:sqldts80.hlp
> Step Error Help Context ID:0
> Step Execution Started: 9/9/2005 3:41:52 PM
> Step Execution Completed: 9/9/2005 3:42:07 PM
> Total Step Execution Time: 15.391 seconds
> Progress count in Step: 42000
> The task runs a VBScript and has 1 Lookup - a check to see if a row exists
> before inserting. The Lookup is using the same connection as the Load
{which
> I read is not good} but it must do so because the existence check is
against
> the table being loaded and loaded rows are not visible to other
> connections...
> Any ideas how I can a) circumvent this issue or b) allow load processing
to
> 'quietly' fail on duplicate inserts but run to completion anyway?
>
>

Help - DTS Custom Task to handle Job Failuer

I need to find a better way to handle DTS job failures. Currently we have about 50 jobs which are executed through DTS packages. Every time a source is not there or comes in late, the DTS sends an email to my pager, which I carry every day. Some of those come in during the holidays, even though I know the source party won't generate the source files on those days.

Trying to avoid getting beeped every time a job fails. Someone suggested that it is possible to add a kind of executable within the Custom task and let it trigger the DTS packages - if, for example, it is a holiday, then don't run the package, so I won't get the page.

Any hints suggestions would be greatly appreciated

J827

I get unhappy when I see 0 replies. I have not done this before, but this is how I would handle it.

I would write a dll in VB. It would check today's date against a list of holidays I might keep in a database table. If it didn't return a result I would use the VB syntax for firing off stored procedures (still in my dll).

Of course before you do this you need to already have your 50 DTS packages in place.

I would then build a new DTS package whose only step was to fire off this custom DLL. Register your custom task and assign it an icon, then build your DTS package using the DTS Designer (not the wizard).

Good luck.|||create a table or array with the dates of all non action days (holidays weekends down days).
create a master package that runs all of your sub packages(use the execute package task).
create a Execute SQL (or activex)Task to evaluate the NonActiveDays table against the execution of the package if getdate() = @.nonactiveday, return 0
else 1
then pass that value as a global variable to the task
when the 1 hits, run the package
this is a fairly simple solution.
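(For illustration, one way the Execute SQL task's query could look - it assumes a table called NonActiveDays with a datetime column NonActiveDate holding the non-action dates:)

SELECT CASE
         WHEN EXISTS (SELECT 1
                      FROM   NonActiveDays
                      WHERE  DATEDIFF(day, NonActiveDate, GETDATE()) = 0)
         THEN 0   -- today is a non-action day: skip the sub-packages
         ELSE 1   -- normal day: run the sub-packages
       END AS RunPackages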

and just for giggles, if your code returns a 0, you can have the task email you a reminder that the task didn't run because of a holiday.

this opens you up to many possibilities; for instance you could have this code run in a job the day before, to remind you that the next day is a holiday and DTS won't run...|||thanks thanks!|||Are you executing this from the web like in ASP.Net or something?

If you are, the problem is that the DTS package needs to run in the same security context that you created it in. The IUSR_etc... account in IIS does not have the ability to execute the package. The workarounds are sloppy and I have discussed them on this forum before. Do a search for ASP and DTS.|||I created a C# Console Application to trigger the DTS package directly and it worked (see code below). But it was an EXE (not DLL) file type.

I would like to plug this app in as part of the control flow of an existing DTS package by adding it as a Custom Task. I just noticed the DTS Custom Task only allows a DLL-format file to be registered. So I recreated the app as a Class Library type and tried to register it by running Regsvr32 appName. I got "Could not find entry point for DllRegisterServer in the DLL" / "appName.DLL was loaded, but the DllRegisterServer entry point was not found; the file can't be registered."

Any helps would be highly appreciated!

---------------------
try
{
DTS.Package2Class package = new DTS.Package2Class();
object pVarPersistStgOfHost = null;

string ServerName = "DEV";
string ServerUserName = null;
string ServerPassword = null;
string PackagePassword = null;
string PackageGUID = null;
string PackageVersionGUID = null;
string PackagName = "DTSTestPackage";


package.LoadFromSQLServer(
ServerName,
ServerUserName,
ServerPassword,
DTS.DTSSQLServerStorageFlags.DTSSQLStgFlag_UseTrustedConnection,
PackagePassword,
PackageGUID,
PackageVersionGUID,
PackagName,
ref pVarPersistStgOfHost);

package.Execute();
package.UnInitialize();

// force Release() on COM object
System.Runtime.InteropServices.Marshal.ReleaseComObject(package);

package = null;
}
catch(System.Runtime.InteropServices.COMException e)
{
Console.WriteLine("COMException {0}", e.ErrorCode.ToString() );
Console.WriteLine("{0}", e.Message);
Console.WriteLine("{0}", e.Source);
Console.WriteLine("Stack dump\n{0}\n", e.StackTrace);
}
catch(System.Exception e)
{
Console.WriteLine("Exception");
Console.WriteLine("{0}", e.Message);
Console.WriteLine("{0}", e.Source);
Console.WriteLine("Stack dump\n{0}\n", e.StackTrace);
}

HELP - Designing a Subscripion System for a Group of Reports

We're developing an ASP.NET 2.0 (C#) application that will act as a
front-end to SQL Server 2005 Reporting Services. We have an interesting
requirement that [we believe] can't be solved using RS out-of-the-box. So,
I'm writing today to get some ideas on possible solutions.
The requirement is that the app. needs to support, what we're calling, a
report group. A report group is simply a group of similar reports that can
be executed as a single unit (instead of having to execute each report
individually). This was not too bad to solve. We simply created our own
database tables that track which report group contains which reports.
The tricky part is that we now have to figure out a way to create a
subscription that executes the report group (thereby executing each
individual report in the report group automatically). Creating a
subscription for a single report is fairly straightforward, but it appears
this concept of a report group is not a part of RS.
Does anyone out there have any good ideas/suggestions on how we can
implement such functionality?
The best thought we have so far is to write another application that manages
the execution of each report in a report group. So, the SQL Server 2005 job
would call this additional little application, this additional app. would
lookup which reports are in the report group, and then it would loop through
all the reports and execute them one-by-one. But, this has some issues-like
how do we handle the output type (PDF, Excel, etc.) and how do we handle the
destination (e-mail, printer, etc.) of each report. Plus, it seems like we
would be somewhat reinventing the wheel as far as subscriptions go.
Thanks.

Why aren't you using one master report with all the other reports as
sub-reports?
Tom Bizannes
|||You could try setting up a shared schedule. It would still require
> Thanks.|||You could try setting up a shared schedule. It would still require
individual subscriptions for each report but they would all start
firing at the same scheduled time.

Help - deleting existing auto-generated primary key

hi guys,

just a question regarding database design

i have a table with an auto-generated primary key but the problem is this:

say I have 4 records, so logically they'll be numbered 1 to 4. The problem is that whenever I delete all records and add new ones, the numbering will start from 5 and not 1 again.

how do i remedy this?

thanx

Two ways:

1. Use TRUNCATE instead of delete if you are deleting *all* rows

2. Use "DBCC CHECKIDENT(TABLE_NAME_HERE, RESEED, 0)" to reset the current identiy value back to required position if you want to use "DELETE" instead of "TRUNCATE"

|||

Another common practice is to create some sort of ID column of your own as well as the auto-generated primary key. Sounds like this may work better for you in this case if you plan on frequently deleting, inserting, etc.

Thanks,
Sam Lester (MSFT)

|||

hi,

I want to ask: is TRUNCATE only used if I want to delete all records in a table?

How am I able to do this if I only want to delete a selected row rather than all rows? Say I've got 4 records and I only want to delete the 2nd record.

So will the 3rd record now be numbered as 2, or will it remain as 3?

Help - Cube Processing

All,
I'm at a loss to understand what's happened to the data warehouse I manage.
Everything was OK up until about 2 weeks ago; then overnight the daily cube
processing jumped from about 15 minutes to nearly 45 minutes per partition.
I've defragged and reindexed all the fact tables - that helped for about 2
days but now has no effect.
The server config is
4 x Xeon 2.8 processors (hyper-threading enabled, effectively 8 processors)
4GB of memory (AWE disabled)
2GB system paging file
5 drives, all RAID 0+1

I would suggest that you scan through your system using the tips and tricks
suggested in the following two white papers:
http://www.microsoft.com/technet/pr...n/ansvcspg.mspx
http://www.microsoft.com/technet/pr...n/anservog.mspx
Both contain lots of great recommendations on how to deal with increased
processing times.
Dave Wickert [MSFT]
dwickert@.online.microsoft.com
Program Manager
BI SystemsTeam
SQL BI Product Unit (Analysis Services)
--
This posting is provided "AS IS" with no warranties, and confers no rights.
"Jason Lees" <JasonLees@.discussions.microsoft.com> wrote in message
news:6EAFB483-D8D0-4770-B1B3-78766465757F@.microsoft.com...
> All,
> I m at a loss to understand whats happened to the data warehouse I manage.
> Everything was OK up until about 2 weeks ago then overnight the daily Cube
> processing jumped from about 15 minutes to nearly 45 minutes/Partition.
> I've Defraged and reindexed all the fact tables, that helped for about 2
> days but no has no effect.
> The server config is
> 4 x Xeon 2.8 Processors (multi threading applied, effectivly 8 Pros)
> 4GB of Memory (AWE Disabled)
> 2GB Sys Paging File
> 5 Drives all Raid 0+1
>
>


Help - Creating reports in pdf format

Hi all,

Greetings

I'm using SQL Server 2000 for creating reports. I would like to display the ad-hoc generated report in PDF format. Do I need a PDF writer to display the report, or can I do it programmatically? If so, please give me the source code to produce a PDF report.

Advance Thanks & Regards

Guna.

PDF creation is an out-of-the-box feature of Reporting Services. It is available as a rendering format in Report Manager (the web interface) for each report, unless it has been deactivated.
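
For example, you can also request the PDF rendering directly through URL access (the server name and report path below are placeholders):

http://<server>/ReportServer?/<folder>/<report>&rs:Command=Render&rs:Format=PDF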

HTH, Jens K. Suessmeyer.

http//www.sqlserver2005.de

Help - Corrupted SQL Server 2K Log. Is this serious

Hello,

I have a SS2K (SP3) server that is appending very weird-looking error messages
to the SQL Server log (Current). The messages are not formatted like
other log entries. The messages suggest some kind of dump information.
The messages have the following characteristics:

Date Information is invalid
Source information is invalid
Message information is strange

The following is a message that appears in my current SQL Server log:

DATE:78008454
SOURCE: Module(MSVCRT+
Message (_endthread+000000C1)

There are many messages like this in the log.

Is this serious?

Thanks - Covi

Covi (covilc@.earthlink.net) writes:
> I have a SS2K (SP3) that is appending very wierd looking error messages
> to the SQL Server Log (Current). The messages are not formatted as
> other log entries. The messages suggest some kinda dump information.
> The messages have the following characteristics:
> Date Information is invalid
> Source information is invalid
> Message information is strange
> The following is a message that appear in my current SQL Server Log
>
> DATE:78008454
> SOURCE: Module(MSVCRT+
> Message (_endthread+000000C1)
> There are many messages like this in the log.
> Is this serious?

When a connection runs into an execution error, one that really should
not occur, for instance an access violation, SQL Server terminates the
connection and leaves a stack dump in the error log. Most such dumps
are due to bugs in SQL Server, and could come from specific queries.
Hardware errors could be involved too. In the latter case, there is
certainly cause for alarm. If you want to sleep better at night, you
could run a DBCC CHECKDB on the database related to the dumps.

But stack dumps can occur on healthy machines too.
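
For example, a minimal integrity check (assuming the database related to the dumps is called YourDatabase):

DBCC CHECKDB ('YourDatabase') WITH NO_INFOMSGS;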

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Help - Contiguous Date Range Query

I have a table that contains (among other things) start and end range datetimes. The ranges can overlap. View these datetimes as segments of coverage on a number line. Rather than returning each line segment individually, I'd like the query to return the start and the end of the contiguous segments. For example, the desired output of the query on the below table would be:

/* Start SQL */

/* Desired output: range_start_datetime range_end_datetime
01/01/2003 01/06/2003
01/08/2003 01/12/2003
*/

create table #ranges (range_start_datetime datetime, range_end_datetime datetime)

/* Range 1: 01/01/2003 to 01/06/2003 */
insert into #ranges(range_start_datetime, range_end_datetime) values ('01/01/2003', '01/02/2003')
insert into #ranges(range_start_datetime, range_end_datetime) values ('01/02/2003', '01/03/2003')
insert into #ranges(range_start_datetime, range_end_datetime) values ('01/02/2003', '01/06/2003')

/* Gap - no data at 01/07/2003 */

/* Range 2: 01/08/2003 to 01/12/2003 */
insert into #ranges(range_start_datetime, range_end_datetime) values ('01/08/2003', '01/09/2003')
insert into #ranges(range_start_datetime, range_end_datetime) values ('01/09/2003', '01/10/2003')
insert into #ranges(range_start_datetime, range_end_datetime) values ('01/10/2003', '01/12/2003')

select * from #ranges
drop table #ranges

/* End SQL */

This version will work in Oracle at least:

SQL> select range_start_datetime,
2 (
3 select min(range_end_datetime)
4 from ranges r3
5 where not exists
6 ( select range_start_datetime from ranges r4
7 where r4.range_start_datetime <= r3.range_end_datetime
8 and r4.range_end_datetime > r3.range_end_datetime
9 )
10 and r3.range_end_datetime >= r1.range_start_datetime
11 )
12 from ranges r1
13 where not exists
14 ( select range_end_datetime from ranges r2
15 where r2.range_end_datetime >= r1.range_start_datetime
16 and r2.range_start_datetime < r1.range_start_datetime
17 );

RANGE_STAR (SELECTMIN
---- ----
01/01/2003 01/06/2003
01/08/2003 01/12/2003

|||This works in Sybase as well. I had to change the first select to distinct when I added a few more test cases, but it works! I'm impressed with your skills.

If you get a moment, could you explain what your query is doing? I didn't even consider a 4-join solution.

Gump

Originally posted by andrewst
This version will work in Oracle at least:

SQL> select range_start_datetime,
2 (
3 select min(range_end_datetime)
4 from ranges r3
5 where not exists
6 ( select range_start_datetime from ranges r4
7 where r4.range_start_datetime <= r3.range_end_datetime
8 and r4.range_end_datetime > r3.range_end_datetime
9 )
10 and r3.range_end_datetime >= r1.range_start_datetime
11 )
12 from ranges r1
13 where not exists
14 ( select range_end_datetime from ranges r2
15 where r2.range_end_datetime >= r1.range_start_datetime
16 and r2.range_start_datetime < r1.range_start_datetime
17 );

RANGE_STAR (SELECTMIN
---- ----
01/01/2003 01/06/2003
01/08/2003 01/12/2003

|||Probably easiest to start from here:

select range_start_datetime
from ranges r1
where not exists
( select range_end_datetime from ranges r2
where r2.range_end_datetime >= r1.range_start_datetime
and r2.range_start_datetime < r1.range_start_datetime
);

All I have done is remove the second column from the main select (which was a "scalar subquery").

This query gets the start of each contiguous range, by finding all the start dates for which there does not exist a preceding record that meets or overlaps it, i.e.:

RANGE_STAR
----
01/01/2003
01/08/2003

Now let's look at the scalar subquery:

select min(range_end_datetime)
from ranges r3
where not exists
( select range_start_datetime from ranges r4
where r4.range_start_datetime <= r3.range_end_datetime
and r4.range_end_datetime > r3.range_end_datetime
)
and r3.range_end_datetime >= r1.range_start_datetime

This sort of does the opposite of the query above: it finds the end of each contiguous range, i.e. all the end dates for which there does not exist a following record that meets or overlaps it, i.e.

select range_end_datetime
from ranges r3
where not exists
( select range_start_datetime from ranges r4
where r4.range_start_datetime <= r3.range_end_datetime
and r4.range_end_datetime > r3.range_end_datetime
)

RANGE_END_
----
01/06/2003
01/12/2003

It then finds the MIN of those end dates where the end date is on or after the start date of each record in the main query (i.e. correlated): for start date 01/01/2003 that will be 01/06/2003, and for start date 01/08/2003 that will be 01/12/2003.

Does that make sense?
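
For reference, a sketch of the same approach written as T-SQL against the #ranges temp table from the original post (DISTINCT added, per the earlier comment, to cope with duplicate start dates):

select distinct r1.range_start_datetime,
       ( -- earliest "end of a contiguous range" that falls on or after this start date
         select min(r3.range_end_datetime)
         from #ranges r3
         where not exists ( select 1 from #ranges r4
                            where r4.range_start_datetime <= r3.range_end_datetime
                              and r4.range_end_datetime > r3.range_end_datetime )
           and r3.range_end_datetime >= r1.range_start_datetime ) as range_end_datetime
from #ranges r1
-- keep only start dates with no earlier row that meets or overlaps them
where not exists ( select 1 from #ranges r2
                   where r2.range_end_datetime >= r1.range_start_datetime
                     and r2.range_start_datetime < r1.range_start_datetime );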

HELP - Connecting SQL Express Edition

I have two machines and the same problem.
I installed SQL Server Express Edition on XP Home Edition and it is OK!
Locally it works OK!
I try to connect from another machine with SQL Server Management Studio and
I can't.
I somehow succeeded in connecting with named pipes, but TCP/IP is not working.
I turned on all protocols with the Configuration Manager and shut down the firewall on both machines.
I tried with SQL Native Client from code, but nothing.
For now only named pipes is working.
What is wrong?

Have a look at these:
http://www.datamasker.com/SSE2005_NetworkCfg.htm Configuring Express
for Remote Access
https://blogs.msdn.com/sqlexpress/a.../05/415084.aspx
Configuring Remote Access
http://www.kbalertz.com/Feedback_914277.aspx Configuring SQL2005 for
remote access
http://msdn.microsoft.com/library/d...rview.asp
Express Overview
http://msdn2.microsoft.com/en-us/library/ms165672.aspx Comparing Express
with MSDE
http://msdn.microsoft.com/sql/defau...erinst.asp
User Instances
http://msdn.microsoft.com/sql/expre...qlexcustapp.asp
Embedding Express in Apps
http://blogs.msdn.com/rogerwolterbl.../13/575974.aspx
Backing up Express
Automating Database maintenance in SQL 2005 Express Edition Part I
http://www.sqldbatips.com/showarticle.asp?ID=27
Automating Database maintenance in SQL 2005 Express Edition Part II
http://www.sqldbatips.com/showarticle.asp?ID=29
Andrew J. Kelly SQL MVP
"Soko" <sokcic@.hotmail.com> wrote in message
news:eQqmarSiGHA.4660@.TK2MSFTNGP03.phx.gbl...
>I have two machines and same problem.
> I installed SQL EE on XP HOME EDITION it is OK !
> Localy work OK !
> I try to connect from another machine with SQL Server Management Studio
> and I can't.
> I some how successed to connect with named pipes but TCP/IP is not
> working.
> I turn on all protocols with Menager. Shout down firewall on both
> machines.
> I tried with SQL Native Client from code but nothing.
> For now anly named pipes working.
> What is wrong ?
>

|||Hi Soko,
I have a very similar (if not the same problem). Have you actually solve it?
Thanks
Zib
From http://www.developmentnow.com/g/100...ess-Edition.htm
Posted via DevelopmentNow.com Groups
http://www.developmentnow.com

|||SQL Server Express installs as a named instance. Unless you change it, the
instance is called <computername>\SQLExpress.
The SQL Server Browser service resolves the instance name to the TCP/IP port number.
I suspect the SQL Server Browser service is not running on your computer. If
you start it, then you may be able to connect.
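
For example, a quick way to test this (assuming the default instance name and Windows authentication):

rem on the server: make sure the SQL Server Browser service is running
net start SQLBrowser

rem from the client: connect by machine and instance name
sqlcmd -S <computername>\SQLEXPRESS -E
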
"Soko" <sokcic@.hotmail.com> wrote in message
news:eQqmarSiGHA.4660@.TK2MSFTNGP03.phx.gbl...
>I have two machines and same problem.
> I installed SQL EE on XP HOME EDITION it is OK !
> Localy work OK !
> I try to connect from another machine with SQL Server Management Studio
> and I can't.
> I some how successed to connect with named pipes but TCP/IP is not
> working.
> I turn on all protocols with Menager. Shout down firewall on both
> machines.
> I tried with SQL Native Client from code but nothing.
> For now anly named pipes working.
> What is wrong ?
>

Help - Conditional Checks within a SQL Query - other ways of doing it?

I'm trying to simplify a SQL Stored Procedure.

The query accepts an int parameter, @ItemTypeID.

I have the Query:

SELECT ... FROM ItemList WHERE ItemTypeID = @ItemTypeID

or, if @ItemTypeID is 0,

SELECT ... FROM ItemList


Is there a way to do this query without doing:

IF @ItemTypeID = 0
BEGIN
...SELECT QUERY...
END
ELSE
BEGIN
...SELECT QUERY...
END

?

See if you can combine the query by using joins. If not, then stay with the current approach. A SQL stored procedure is not meant to be as readable as C#; it goes by the execution plans. Hence even dynamic SQL should be used reluctantly.

|||

select... from itemList where (@itemTypeID = 0) or (@ItemTypeID = ItemTypeID)

|||

Yes, although what you have will execute the fastest.

If you feel it's not worth the performance difference, you can always do this:

SELECT * FROM ItemList WHERE (ItemTypeID=@ItemTypeID OR @ItemTypeID=0)

|||

Ok, thanks guys.

I guess the extra

WHERE (ItemTypeID=@ItemTypeID OR @ItemTypeID=0)

Adds a check on each row that is returned - which slows it down. I'll keep the current ugly method :)

Just sometimes I might have to check 5-6 things... which looks very ugly.
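
For what it's worth, the same catch-all pattern extends to several optional filters; a sketch (the extra columns and parameters here are hypothetical):

SELECT ... FROM ItemList
WHERE (@ItemTypeID = 0 OR ItemTypeID = @ItemTypeID)
  AND (@CategoryID = 0 OR CategoryID = @CategoryID)
  AND (@StatusID   = 0 OR StatusID   = @StatusID)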

|||

I'd be amazed if you could measure the difference.

Help - Complex Currency Conversion - MDX

Dear all,

I have implemented currency conversion in my SSAS cube (one-to-many) and it is working fine. (A Reporting Currency dimension that has the currencies to which data can be converted, an FX Rates measure group that has FX rates for those currencies from the base currency (GBP), and a calculation to convert it.) The calculation I use is as below.

Code Snippet

Scope ( {Measures.[YTD Value Reporting CCY], Measures.[LTD Value Reporting CCY]} );
    Scope ( Leaves ([Calendar]),
            Except ([Reporting Currency].[Reporting Currency].[Reporting Currency].Members,
                    [Reporting Currency].[Reporting Currency].&[GBP]) );
        Scope ( {Measures.[YTD Value Reporting CCY], Measures.[LTD Value Reporting CCY]} );
            This = [Reporting Currency].[Reporting Currency].[GBP] *
                   ( Measures.[Rate],
                     [Reporting Currency].[Reporting Currency].CurrentMember
                   );
        End Scope;
    End Scope;
End Scope;

Now I have a situation in which the FX rates can be different for each record in the fact table. So I have introduced the FX Rate Set ID dimension (a degenerate dimension for the rate measure group) and also have it as a measure (non-additive, visible=false) on the Balance measure group that contains the data to be converted. Now, how do I achieve the actual conversion by modifying the above MDX?

Something similar to the below is what I'm trying to achieve, but I have no idea how to get it right in MDX. Please help with the correct approach and the right MDX.

Code Snippet

This = [Reporting Currency].[Reporting Currency].[GBP] *
       ( Measures.[Rate],
         [Reporting Currency].[Reporting Currency].CurrentMember,
         [FX Rate Set].[Set ID].[<<'Value of Measures.[Rate Set ID]'>>]
       );

Thanks in advance.

Cheers,

Arun

Please, can anyone help me on this?

|||Could you explain this statement with an example: "I have a situation in which the FX rates can be different for each record in the fact table"? Are you referring to the Balance or the FX Rate fact table - and does this mean that FX rates may vary on the same day?|||

Below is the sample data.

Fx Rates (the pivot currency is GBP):

Date         RateSet  ToCurrency  Rate
03 May 2007  1        AUD         2.54
03 May 2007  2        AUD         2.51
03 May 2007  1        GBP         1
03 May 2007  2        GBP         1
03 May 2007  1        EUR         1.51
03 May 2007  2        EUR         1.6

Fact Table:

ID   Date         <<Other dimension keys>>  Currency  Balance  BalanceInGBP  FXRateSet
101  03 May 2007                            USD       100      195           1
102  03 May 2007                            USD       100      194           2

Measure Group:

Balance MG:

Balance (FactTable.Balance)

BalanceReportingCurrency (FactTable.BalanceInGBP)

Rates:

Rate ([Fx Rates].Rate)

Calculation :

In my first post... (the column names may be a bit different.)

In the above scenario, for the same date I may have to use a different "Fx Rate Set", whose ID is present in both the fact table and the Fx rates table. I have a dimension called Currency that joins to the Currency column in the fact table. I also have another dimension called [Reporting Hierarchy] that holds the ToCurrency from Fx Rates, based on which the conversion will be done dynamically using calculations (the BalanceReportingCurrency measure will be calculated dynamically using the corresponding rates).

|||Forgot to mention - I have the FXRateSetID from the fact table as a measure in the Balance MG, as it may help in doing the calculation.

|||One idea is to slightly change the original solution by creating a new dimension like DailyRateSet, whose key attribute is a collection of the 2 fields: Date, RateSetID. This dimension would be used as the intermediate dimension, instead of Calendar. Another approach would be with a separate RateSet dimension, but I haven't been able to work that out yet.|||

Finally got it working. I have the Rate Set as another dimension referenced by both the fact table and the Fx rates table; then, whilst doing the calculation, I perform the calc at the leaf levels of both Fx Rate Set and Time, using the Fx Rate Set dimension to get the correct rate.

At present, I convert all values to GBP when loading into my cube. To make the data independent of rate-change problems, I'm now trying to work out many-to-many currency conversion.
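
For anyone following along, a rough sketch of what the scoped assignment could look like once the FX Rate Set dimension relates both measure groups (the Leaves([FX Rate Set]) scope and attribute names are assumptions based on the snippets above, not the exact working calculation):

Scope ( {Measures.[YTD Value Reporting CCY], Measures.[LTD Value Reporting CCY]} );
    Scope ( Leaves ([Calendar]),
            Leaves ([FX Rate Set]),
            Except ([Reporting Currency].[Reporting Currency].[Reporting Currency].Members,
                    [Reporting Currency].[Reporting Currency].&[GBP]) );
        // with both measure groups related to [FX Rate Set], the (Rate, CurrentMember) tuple
        // picks up the rate for the current rate set as well as the current reporting currency
        This = [Reporting Currency].[Reporting Currency].[GBP] *
               ( Measures.[Rate],
                 [Reporting Currency].[Reporting Currency].CurrentMember );
    End Scope;
End Scope;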