SQL Server - SqlPackage not exporting entire database
I'm trying to move a large database (50 GB) to Azure. I'm running the following command on my local SQL Server to generate a bacpac I can upload:
sqlpackage.exe /a:export /ssn:localhost /sdn:mdbilling /su:sa /sp:somepassword /tf:"d:\test.bacpac"
The export does not print any errors and finishes with "Successfully exported database and saved it to file 'd:\test.bacpac'."
When I look at the bacpac in the file system, it comes out to 3.7 GB. There's no way a 50 GB database can be compressed that small, but I tried uploading it to Azure regardless. The package upload succeeds, but when I query the Azure database, all of the tables return 0 rows. It's as if the bacpac does not contain any of the database's data.
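To compare the source and target databases, one quick sanity check is to read the row counts from the catalog views on both sides. This is a generic sketch (not specific to this database; table and schema names come from the catalog itself):

```sql
-- Approximate row count per table, from partition metadata.
-- Run against both the local database and the Azure database and compare.
SELECT s.name AS schema_name,
       t.name AS table_name,
       SUM(p.rows) AS row_count
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.partitions p ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)  -- heap or clustered index only
GROUP BY s.name, t.name
ORDER BY row_count DESC;
```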
Are there any known limitations of this export method (database size, data types, etc.)?
I tried using the 64-bit version of SqlPackage after reading that some people experienced out-of-memory issues on large databases, but I wasn't getting that error, or any error for that matter.
Update/Edit: I made some progress after ensuring the export was transactionally consistent, by restoring a backup and extracting the bacpac from that. However, I have run into a new error when uploading to Azure.
I receive the following message (using an S3 database):
Error encountered during the service operation. Data plan execution failed with message: One or more errors occurred. One or more errors occurred. One or more errors occurred. One or more errors occurred. XML parsing: document parsing required too much memory. One or more errors occurred. XML parsing: document parsing required too much memory.
The problem was resolved. The issues were two-fold.
First, because bacpac operations are not transactionally consistent, I had to restore a backup and make the bacpac out of the restored database. This ensured users were not adding rows while the bacpac was being generated.
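The restore step can be sketched in T-SQL as below. The database name, backup path, and file locations are hypothetical placeholders; the point is that SqlPackage is then run against the restored copy, which nobody is writing to:

```sql
-- Restore the backup into a separate, quiesced copy of the database.
-- Paths and logical file names are examples; check yours with
-- RESTORE FILELISTONLY FROM DISK = N'D:\backups\mdbilling.bak';
RESTORE DATABASE mdbilling_copy
FROM DISK = N'D:\backups\mdbilling.bak'
WITH MOVE N'mdbilling'     TO N'D:\data\mdbilling_copy.mdf',
     MOVE N'mdbilling_log' TO N'D:\data\mdbilling_copy.ldf',
     RECOVERY;
```

After the restore completes, point the same SqlPackage command at the copy (e.g. `/sdn:mdbilling_copy`) so the export reflects a single point in time.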
The second issue was an XML column in the database. The table has 17 million rows, and 250 of those rows had large XML documents stored in them (200,000+ characters). Removing those 250 rows and then reimporting them solved my problems. I don't think the size of the XML documents was something Azure had an issue with; I think the large documents contained special characters the XML parser didn't like.
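A query along these lines can identify the oversized rows before the export (table and column names here are made up for illustration):

```sql
-- Find rows whose XML payload exceeds ~200,000 characters.
-- DATALENGTH returns bytes; the xml type stores UTF-16-ish internally,
-- so ~2 bytes per character is a reasonable rough cutoff.
SELECT BillingId, DATALENGTH(XmlPayload) AS payload_bytes
FROM dbo.BillingRecords
WHERE DATALENGTH(XmlPayload) > 400000
ORDER BY payload_bytes DESC;
```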
It's unclear to me how SQL Server allows unparseable XML into the database in the first place, but that's another issue.
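One way to hunt for documents that fail a round-trip through the parser is to convert each value to text and back with `TRY_CONVERT` (SQL Server 2012+), which returns NULL instead of raising an error. This is a sketch with the same hypothetical table and column names as above, and it can be slow on 17 million rows:

```sql
-- Rows whose XML does not survive a text round-trip.
-- TRY_CONVERT yields NULL where CONVERT would throw a parse error.
SELECT BillingId
FROM dbo.BillingRecords
WHERE XmlPayload IS NOT NULL
  AND TRY_CONVERT(xml, CONVERT(nvarchar(max), XmlPayload)) IS NULL;
```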