
Thursday, May 1, 2025

🔄 Batch Append Feature Classes from Multiple GDBs into a Final Data Model (ArcPy)

 


In GIS data integration workflows, especially when working with multiple sources contributing to the same data model, it's common to append data from various geodatabases into a final unified schema.

This script automates that process using ArcPy, respecting both:

  • Dataset-based feature classes, and

  • Root-level feature classes (a short enumeration sketch follows this list).
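
As a quick orientation, the sketch below shows how ArcPy separates those two levels when pointed at a geodatabase. The path Sample.gdb is a placeholder used only for illustration, not one of the project GDBs.

python

import arcpy

# Placeholder GDB path, for illustration only
gdb = r"C:\Data\Sample.gdb"
arcpy.env.workspace = gdb

# Root-level feature classes (stored directly in the GDB)
root_fcs = arcpy.ListFeatureClasses() or []
print("Root-level:", root_fcs)

# Feature classes stored inside feature datasets
for ds in arcpy.ListDatasets("", "Feature") or []:
    ds_fcs = arcpy.ListFeatureClasses("", "", ds) or []
    print(f"Dataset '{ds}':", ds_fcs)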


📁 Folder Setup

  • final_gdb: The destination GDB that contains your field-validated feature classes (the data model).

  • folder_path: A folder containing multiple source .gdb files. These should mirror the structure of the final GDB (dataset and feature class names must match for the automatic append to work); a quick listing check is sketched below.
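
Before running the full script, it can help to confirm which items in folder_path will actually be picked up. This is a minimal sketch using the same detection rule as the script (a folder whose name ends in .gdb); the folder path shown is a placeholder.

python

import os

# Placeholder folder of source GDBs, for illustration only
folder_path = r"C:\Data\SourceGDBs"

# A File Geodatabase is a folder whose name ends in ".gdb"
source_gdbs = [
    os.path.join(folder_path, name)
    for name in os.listdir(folder_path)
    if name.endswith(".gdb") and os.path.isdir(os.path.join(folder_path, name))
]

for gdb in source_gdbs:
    print("Will append from:", gdb)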


🐍 ArcPy Script

python

import arcpy
import os

# Path to the geodatabase containing the final feature class models (with the validated fields)
final_gdb = r"C:\Work\Projects\Zatca\UG_Data\5Th\Merge\Batha_Total.gdb"

# Folder containing the source GDB files to append from
folder_path = r"C:\Work\Projects\Zatca\UG_Data\5Th\DataBaseBathaNew"

# Get a list of all datasets and feature classes in the final GDB (these are the models)
arcpy.env.workspace = final_gdb
final_datasets = arcpy.ListDatasets() or []
final_feature_classes = arcpy.ListFeatureClasses() or []

# Iterate through each GDB in the folder containing the source feature classes
for gdb_name in os.listdir(folder_path):
    gdb_path = os.path.join(folder_path, gdb_name)

    # Process only valid File Geodatabases
    if os.path.isdir(gdb_path) and gdb_name.endswith('.gdb'):
        arcpy.env.workspace = gdb_path

        # List all datasets and feature classes in the current source GDB
        source_datasets = arcpy.ListDatasets() or []
        source_feature_classes = arcpy.ListFeatureClasses() or []

        # Process feature classes stored inside feature datasets
        for final_dataset in final_datasets:
            if final_dataset in source_datasets:
                final_dataset_path = os.path.join(final_gdb, final_dataset)
                source_dataset_path = os.path.join(gdb_path, final_dataset)

                # Switch the workspace to each dataset to list its feature classes
                arcpy.env.workspace = source_dataset_path
                source_fc_list = arcpy.ListFeatureClasses() or []
                arcpy.env.workspace = final_dataset_path
                final_fc_list = arcpy.ListFeatureClasses() or []

                for final_fc in final_fc_list:
                    if final_fc in source_fc_list:
                        source_fc_path = os.path.join(source_dataset_path, final_fc)
                        final_fc_path = os.path.join(final_dataset_path, final_fc)
                        arcpy.Append_management([source_fc_path], final_fc_path, "NO_TEST")
                        print(f"✅ Appended data from '{source_fc_path}' to '{final_fc_path}'.")

        # Process root-level feature classes (not stored in a dataset)
        arcpy.env.workspace = gdb_path
        for final_fc in final_feature_classes:
            if final_fc in source_feature_classes:
                source_fc_path = os.path.join(gdb_path, final_fc)
                final_fc_path = os.path.join(final_gdb, final_fc)
                arcpy.Append_management([source_fc_path], final_fc_path, "NO_TEST")
                print(f"✅ Appended data from '{source_fc_path}' to '{final_fc_path}'.")

✅ Good To Know

  • This script assumes schema consistency between source and target FCs.

  • "NO_TEST" option tells ArcPy not to check schema compatibility (use only if you've pre-validated).

  • You can switch to "TEST" if you want to enforce field-by-field schema checks.
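
If you want to pre-validate before using "NO_TEST", a rough field comparison like the one below can act as a gate. The fields_match helper is illustrative and not part of the script above; it only compares field names and types, so domains, lengths, and aliases are not checked.

python

import arcpy

def fields_match(source_fc, target_fc):
    """Rough pre-check: compare editable field names and types."""
    def schema(fc):
        # Ignore system-maintained OID and geometry fields
        return {
            f.name.lower(): f.type
            for f in arcpy.ListFields(fc)
            if f.type not in ("OID", "Geometry")
        }
    return schema(source_fc) == schema(target_fc)

# Illustrative usage: fall back to "TEST" when the rough check fails
# schema_type = "NO_TEST" if fields_match(src_fc, dst_fc) else "TEST"
# arcpy.Append_management([src_fc], dst_fc, schema_type)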