[GIS] Multiple Python imports from feature classes into feature datasets in a geodatabase

Tags: arcpy, feature-class, feature-dataset, file-geodatabase, import

Is it possible to create a Python script that imports shapefiles/feature classes from different sources into specific feature datasets in a geodatabase, all in one script?

I managed to write a script that creates multiple feature datasets in the geodatabase. This is the code:

    import arcpy

    arcpy.CreateFileGDB_management("D:/GIS_Temp", "TEST.gdb")
    fdList = ["Dataset_A", "Dataset_B", "Dataset_C"]
    for fd in fdList:
        arcpy.CreateFeatureDataset_management("D:/GIS_Temp/TEST.gdb", fd, "D:/GIS_Temp/Projection.prj")

And now I would like to import the shapefiles/feature classes into specific datasets: for example, from "Folder_A" to "Dataset_A", from "Folder_B" to "Dataset_B", and so on.

I could do it one script at a time, like this:

    import arcpy

    arcpy.env.workspace = 'D:/GIS_Temp/Folder_A'
    arcpy.FeatureClassToGeodatabase_conversion(["File_X", "File_Y"],
                                               'D:/GIS_Temp/Test.gdb/Dataset_A')

But is it possible to do all of it in one script? I am really new to Python scripting, so I don't know how to combine those separate scripts to do the task.

If I use this code, I can get the shapefiles into the root of the geodatabase, but not into a feature dataset:

    import arcpy
    from arcpy import env
    import os

    env.workspace = "D:/GIS_Temp/Folder_A/"
    fcList = arcpy.ListFeatureClasses()
    for fc in fcList:
        arcpy.CopyFeatures_management(fc, "D:/GIS_Temp/Test.gdb/" + os.sep + fc.rstrip(".shp"))

And if I add the name of a dataset like this, it won't run:

arcpy.CopyFeatures_management(fc, "D:/GIS_Temp/Test.gdb/Dataset_A" + os.sep + fc.rstrip(".shp"))
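(Side note: `fc.rstrip(".shp")` does not strip the literal suffix `.shp`; `str.rstrip` treats its argument as a *set* of characters, so any trailing `.`, `s`, `h`, or `p` gets removed and some names are mangled. `os.path.splitext` is the safe way to drop the extension. A quick arcpy-free demonstration:)

```python
import os

# rstrip strips trailing characters from the SET {'.', 's', 'h', 'p'},
# not the suffix ".shp" -- so "parks.shp" loses its final "s" as well:
print("parks.shp".rstrip(".shp"))   # -> "park"  (name is mangled)

# os.path.splitext splits off the real extension instead:
name, ext = os.path.splitext("parks.shp")
print(name)                          # -> "parks"
```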

This could work:

    import arcpy

    arcpy.env.workspace = 'D:/GIS_Temp/Folder_A'
    arcpy.FeatureClassToGeodatabase_conversion(["File_X", "File_Y"],
                                               'D:/GIS_Temp/Test.gdb/Dataset_A')

    import arcpy

    arcpy.env.workspace = 'D:/GIS_Temp/Folder_B'
    arcpy.FeatureClassToGeodatabase_conversion(["File_X", "File_Y"],
                                               'D:/GIS_Temp/Test.gdb/Dataset_B')

but only from one folder to one dataset at a time. I would like to write ONE script that imports the defined shapefiles/feature classes into their specific feature datasets.

Best Answer

I suspect that your script is failing because the output already exists. To avoid this, set overwriteOutput to True, which is easier than checking and deleting, but may not be what you need in the long run.

Putting your scraps together into one contiguous code block:

import arcpy
import os

# set overwriteOutput = True so it won't crash if
# the output already exists
arcpy.env.overwriteOutput = True

# set up your from/to as lists of the same length
fdList  = ["Dataset_A", "Dataset_B", "Dataset_C"]
folList = ["c:\\path\\Folder A", "c:\\path\\Folder B", "c:\\path\\Folder C"]

# create a range [0, 1, 2] to index the lists
workRange = range(len(fdList))

for thisIndex in workRange:
    # step through the lists one by one
    fd = fdList[thisIndex]
    arcpy.env.workspace = folList[thisIndex]

    arcpy.CreateFeatureDataset_management("D:\\GIS_Temp\\TEST.gdb", fd, "D:\\GIS_Temp\\Projection.prj")
    for impFC in arcpy.ListFeatureClasses():
        fcName, fcExt = os.path.splitext(impFC)  # split into name and extension
        fcName = fcName.replace(" ", "_")  # get rid of spaces (replace returns a new string)

        # os.path.join is your friend here; it will join the components
        # of paths with your OS path separator
        arcpy.FeatureClassToFeatureClass_conversion(os.path.join(folList[thisIndex], impFC),
                                                    os.path.join("D:\\GIS_Temp\\TEST.gdb", fd),
                                                    fcName)

This will match up your folders to feature datasets and import each feature class. Beware that names must be unique, as I said in the comments; this code does not check for that.
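As an aside, indexing two parallel lists with `range(len(...))` works, but `zip` is the more idiomatic way to walk them in lockstep. Here is a minimal, arcpy-free sketch of just the pairing and name-cleaning logic (the folder list, dataset list, and file names are placeholders; in the real script the inner list would come from `arcpy.ListFeatureClasses()`):

```python
import os

fdList  = ["Dataset_A", "Dataset_B", "Dataset_C"]
folList = ["c:\\path\\Folder A", "c:\\path\\Folder B", "c:\\path\\Folder C"]

# zip pairs each source folder with its target feature dataset,
# so no index bookkeeping is needed
for folder, fd in zip(folList, fdList):
    # stand-in for arcpy.ListFeatureClasses() in that folder
    for impFC in ["File X.shp", "File_Y.shp"]:
        fcName, fcExt = os.path.splitext(impFC)  # drop the .shp extension
        fcName = fcName.replace(" ", "_")        # replace returns a new string
        print(folder, "->", os.path.join(fd, fcName))
```

The assignment `fcName = fcName.replace(...)` matters: strings are immutable in Python, so calling `replace` without keeping the result discards the cleaned name.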