I need to copy a file gdb from my local machine to a server using a remote desktop connection. I have been doing this using a simple copy/paste operation in Windows Explorer. I know that it is best practice to do this using ArcCatalog, but in this situation I can't.
What I would like to know is how risky this procedure is. I know there is a chance that there may still be locks on the datasets, or other processes accessing them, which could cause the data to be copied partially or incorrectly. I have indeed noticed this: just having ArcCatalog open on my local machine (not even browsing the folder containing the gdb) could cause the paste to fail or complete incorrectly.
What I have found is that it is quite inconsistent. Yesterday I had an mxd open which was referencing the file gdb. I then copied it through Windows Explorer and pasted it on the server. I was able to work with the contents on the server without a problem.
Today, the first thing I did after logging in was to copy and paste the same file gdb onto the server again. When I tried to open it in ArcMap, it was identified as a normal folder named test.gdb, not as an actual geodatabase named test.
Would using something like TeraCopy help in this case, or do I just have to hope that all the data gets carried over each time? A similar question noted that ArcCatalog is the safest way to move a gdb (I agree; I would not dare to cut and paste the gdb through Windows Explorer).
Best Answer
From ESRI:
http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#//003n0000007v000000
Your best bet is to script this using ArcPy if possible. Essentially, using anything other than an ESRI product for this will probably lead to problems. You may be able to write a script using ogr2ogr with FileGDB support that copies all the feature classes in the gdb, but I don't believe you can copy the whole thing at once.
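If you do end up copying through the file system anyway, the lock concern from the question can at least be checked for first: open ArcGIS sessions keep `*.lock` files inside the `.gdb` folder, so refusing to copy while any are present avoids the most common partial-copy scenario. Here is a minimal plain-Python sketch (the function name and paths are placeholders, not an Esri API; inside ArcGIS, the Copy geoprocessing tool via ArcPy remains the supported route):

```python
import shutil
from pathlib import Path

def copy_gdb(src: str, dst: str) -> None:
    """Copy a file geodatabase folder, refusing while lock files exist.

    Hypothetical helper: outside ArcGIS a .gdb is just a folder, and open
    sessions (ArcMap, ArcCatalog) keep *.lock files inside it. A copy made
    while those exist may be partial or corrupt, so bail out instead.
    """
    locks = list(Path(src).glob("*.lock"))
    if locks:
        names = ", ".join(p.name for p in locks)
        raise RuntimeError(f"gdb appears to be in use (lock files: {names})")
    # Plain recursive file copy; no geodatabase-level validation is done.
    shutil.copytree(src, dst)
```

This only catches the in-use case; it cannot detect a gdb that was already corrupted by an earlier bad copy, which is why the ArcPy route is still preferable.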
See:
http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#//001700000051000000
from:
Copy file geodatabase using Python?