I have several file geodatabases I've downloaded from a content provider. A subset of them, which I'm filtering by a couple of strings, have 1 feature dataset within (containing many feature classes) that I'd like to list and copy – at the feature dataset level – into one final geodatabase.
I am running into problems with getting the feature datasets isolated and listed.
Here is my code:
import os, zipfile, arcpy

pathPrefix = "C:\\Incoming"
newPath = '/FINAL'
try:
    os.makedirs(pathPrefix + newPath)
except OSError:  # directory already exists
    pass
for f in os.listdir(pathPrefix):
    <DO UNARCHIVING/EXTRACTION HERE>
Here's the part I'm having issues with:
fileFilter = ['_DA_10', '_DB_10']
for path, subdirs, files in os.walk(pathPrefix + newPath):
    for name in subdirs:
        if name.endswith(".gdb"):
            if any(x in name for x in fileFilter):
                print(name)  # This gives me the correct FGDBs I'm looking for
                workspace = os.path.join(pathPrefix, name)
                walk = arcpy.da.Walk(workspace, datatype="FeatureDataset")
                for dirpath, dirnames, filenames in walk:
                    for filename in filenames:
                        print(filename)
It runs, but gives me no feature datasets listed.
Best Answer
What about just using
arcpy.ListDatasets
? Then test the name of each feature dataset to see if it's the one you want; if so, copy it. Copying the feature dataset will bring along all the feature classes in it.
Here's a quick test for setting the workspace and getting FDs; you can run this in the Python window of ArcMap or ArcCatalog:
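The snippet itself is missing from the capture; a minimal sketch of what it might look like. The `.gdb` path is a hypothetical example, the `wanted`/`list_feature_datasets` helper names are mine, and the `arcpy` calls (`arcpy.env.workspace`, `arcpy.ListDatasets`) require an ArcGIS Python session; only the name-filter helper is plain Python:

```python
fileFilter = ['_DA_10', '_DB_10']

def wanted(dataset_name, filters=fileFilter):
    # True if the dataset name contains any of the filter strings
    return any(x in dataset_name for x in filters)

def list_feature_datasets(gdb_path, filters=fileFilter):
    # Requires an ArcGIS Python session; gdb_path is a hypothetical example,
    # e.g. r"C:\Incoming\FINAL\Sample_DA_10.gdb"
    import arcpy
    arcpy.env.workspace = gdb_path
    # The "Feature" feature_type restricts ListDatasets to feature datasets
    return [fd for fd in arcpy.ListDatasets("*", "Feature") if wanted(fd, filters)]
```

Once you have the matching names, arcpy.Copy_management on each feature dataset copies it, with all of its feature classes, into the final geodatabase.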