Hi,
I have a giant DB that I want to back up. One of its tables takes up a lot of space, and I don't need it in the backup, so I am looking for a way to back up everything except that particular table.
Does any of you know of a way?
Thanks!
BTW, it's Microsoft SQL Server 2000 - 8.00.2040 (Intel X86).
|||The only way to do it is pretty ugly: you would need to BCP out all the data and script out all the objects in the database. This would still miss users and groups, however.
|||Thanks for the quick answer, man!
Then I'll change the question:
My original need is to set up a test DB server. I thought the best and easiest way to build it was with a backup/restore. Of course, I ran into this issue: my test server cannot hold all the data, and getting rid of this table was the solution.
Is there any workaround?
Thanks again,
Francisco.
|||A slightly easier approach, then, would be to generate the SQL script of all objects and run it on the test database. After that, you can import any set of tables (and even a portion of the big table) into the test server. You will still need to create users and roles and make sure the permissions are correct, but the script will likely take care of most of that.
|||There are ways to accomplish this IF you can change the production system. That isn't usually an option, so for now I'll disregard it.
I'd approach this problem by:
1) Scripting out the DDL needed to create the tables only (no constraints, no indices, no code, no permissions, just the tables). Play this script into the currently empty test database.
2) Use either BCP or DTS to copy the data from the tables you need copied from the production database to the test database.
3) Script out the remainder of the schema (the parts you left out of the first script) from the production database, and play that script in the populated test database.
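To make step 2 concrete, here is a minimal sketch of copying one table's data with BCP, driven from T-SQL via `xp_cmdshell` (available on SQL Server 2000). All of the database, table, server, and path names here are placeholders I've invented for illustration; substitute your own, and note that `xp_cmdshell` must be enabled and the service account needs access to the file path.

```sql
-- Step 2 sketch: copy one table's data with BCP (names are placeholders).
-- Export from the production database in native format (-n), using a
-- trusted connection (-T) to the production server (-S PRODSRV):
EXEC master..xp_cmdshell
    'bcp BigDB.dbo.Orders out C:\temp\Orders.bcp -n -S PRODSRV -T'

-- Import into the test database, after the table-only DDL script from
-- step 1 has been played in (so dbo.Orders already exists but is empty):
EXEC master..xp_cmdshell
    'bcp TestDB.dbo.Orders in C:\temp\Orders.bcp -n -S TESTSRV -T'
```

You would repeat the pair for each table you want populated, simply skipping the oversized table. Because constraints and indexes are not created until step 3, the loads are not slowed down or blocked by them.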
-PatP