Showing posts with label number. Show all posts

Thursday, March 22, 2012

Build 9.0.3050

I downloaded SP2 on March 10, 2007, and applied it. The system to which I applied it reports 9.0.3050 as the version number, not 9.0.3042. Is this due to the post-SP2 fix?

My system, which was updated with the bugged SP2 and then the subsequent hotfix, also reports 9.0.3050, so that seems like a reasonable assumption.

Refer to this link for the latest fixes in SQL Server 2005:

http://support.microsoft.com/kb/933508

The SP2 release version was 9.00.3042.00.

The latest version with this fix (Mar 05) is 9.00.3050.00.

Madhu

Wednesday, March 7, 2012

BUG in BIDS when developing SSIS packages

There seems to be a BUG in BIDS when developing SSIS packages using the Import/Export Data wizard.

If you use the wizard to import a large number of tables, select all the tables, and then choose to delete existing data in each table, the PrologueSQL file does NOT get built correctly. Instead of having a

TRUNCATE TABLE tablename
GO

for each table, it just has a bunch of "GO"s with nothing between them. In the step immediately prior, where you confirm what the wizard will do, it tells you, after each table, that it will delete any existing data... but it doesn't do this.
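For reference, the expected prologue is simply one TRUNCATE TABLE statement plus a GO batch separator per selected table. A minimal sketch of what a correctly built PrologueSQL would contain (the helper and table names here are illustrative, not part of the wizard):

```python
def build_prologue(tables):
    """Build the prologue script the wizard is expected to emit:
    one TRUNCATE TABLE statement followed by a GO batch separator
    per table."""
    return "\n".join(f"TRUNCATE TABLE {t}\nGO" for t in tables)

# Table names are hypothetical examples.
print(build_prologue(["dbo.Customers", "dbo.Orders"]))
```

With the bug, the output would instead contain only the GO separators with nothing between them.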

If, during the wizard, I select each individual table one at a time and tell it to delete existing data, then it will get built correctly, but not if I select them all at once...YET, if I do select the whole block, choose delete existing data, and then select any single table, it shows that table as being set up to delete existing rows.

This is very frustrating when trying to import large numbers of tables.

Am I missing something, or is this really a bug?

Thanks, Jeff

Jeff,

We've noticed the same behavior here. If we edit the mappings for each of the tables individually, we see that the 'Delete rows in destination table' option is selected (after selecting that option for the group). If we click OK in the mappings dialog, the TRUNCATE option is then correctly added to the package.

I vote for bug.

Ray

Post it at:

http://connect.microsoft.com/sqlserver/feedback

Yes, this is a known bug.

The best workaround I can offer is to unselect the "Optimize for Many Tables" checkbox. The transfer will probably succeed if the number of tables is reasonable (say, less than 100) for your hardware (transfers will go in parallel). If it fails, try to go in multiple batches with a smaller number of tables.
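The batching idea above can be sketched as follows; the function name and the 100-table cutoff are illustrative (the post gives ~100 only as a rule of thumb):

```python
def batches(tables, size=100):
    """Split a long table list into smaller groups, each small enough
    for a separate wizard run."""
    return [tables[i:i + size] for i in range(0, len(tables), size)]

# 250 tables split into runs of at most 100.
groups = batches([f"table_{n}" for n in range(250)], size=100)
print([len(g) for g in groups])  # [100, 100, 50]
```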

Thanks.


Tuesday, February 14, 2012

Breaking the 8kb barrier on UDT

Is it possible, by any kind of workaround, to break the 8kb limit on user-defined datatypes?

My datatype can contain an arbitrary number of double-precision points, meaning that in the best case I can only store 512 points (2 x 8 x 512). There are a few extra bytes used for something else, but this is roughly the maximum, which is far from what I need in many cases. I serialize the object myself to ensure that I only store what I really need.
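The byte arithmetic above (512 points x 2 doubles x 8 bytes = 8192 bytes, i.e. right at the 8 KB ceiling) can be checked with a small serialization sketch; the function name is illustrative, not part of any SQL Server API:

```python
import struct

def pack_points(points):
    """Serialize (x, y) double-precision pairs into one binary blob,
    e.g. for storage in a varbinary(max) column instead of a UDT."""
    return b"".join(struct.pack("<dd", x, y) for x, y in points)

# Each point costs 2 doubles x 8 bytes = 16 bytes,
# so 512 points exactly fill 8192 bytes.
blob = pack_points([(float(i), 2.0 * i) for i in range(512)])
print(len(blob))  # 8192
```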

Not really, if you still want to have your UDT in the database.

What you could do is send a binary blob down to the database via a SQLCLR proc/function and insert it into a varbinary(max) column. On the client you would then retrieve the binary and re-populate it into your type.

Niels

Break on Running Total

I am trying to create Packing List(s) based on an individual sales order.

Each Packing List page must be based upon the number of square feet.

I have set up a detail line with Pieces, Width, Length, and a running total of square feet (Pieces * Width * Length).

When the Running Total Square Feet > 400 then I need to go to a new page and reset the Running Total Square Feet and continue to print the remaining detail, until it again reaches 400 square feet.

I'm close, but I am having trouble. Can anyone help?

Note: It's MAS 90, if that makes a difference.

What did you try, and what problem are you facing?
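Outside the report designer, the break logic described above amounts to grouping detail lines by a running total that resets at the page limit. A sketch of one interpretation (keeping each page at or under the limit; names are illustrative):

```python
def paginate(details, limit=400.0):
    """Group (pieces, width, length) detail lines into pages, starting a
    new page (and resetting the running total of square feet) once the
    accumulated total would exceed the limit."""
    pages, current, total = [], [], 0.0
    for pieces, width, length in details:
        sqft = pieces * width * length
        if current and total + sqft > limit:
            pages.append(current)   # close the current page
            current, total = [], 0.0
        current.append((pieces, width, length))
        total += sqft
    if current:
        pages.append(current)
    return pages

# Three 200 sq ft lines: the first two fill a page, the third starts a new one.
print([len(p) for p in paginate([(10, 2.0, 10.0)] * 3)])  # [2, 1]
```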