
Monday, May 5, 2025

🧩 Batch Append Feature Classes from Multiple GDBs into a Final Data Model (ArcPy)


In GIS data integration workflows, especially when multiple sources contribute to the same data model, it's common to append data from several geodatabases into a final unified schema. The following script automates that process, ensuring that both dataset-based feature classes and root-level feature classes are appended correctly into a final geodatabase.

This ArcPy-based Python script walks each source geodatabase, matches its datasets and feature classes against the target data model, and appends the data without altering the target schema.


🚀 Key Steps in the Script:

  1. Identify and list feature classes from both dataset-based and root-level sources (see the listing sketch after this list).

  2. Append data from the source GDBs to the final target data model, ensuring consistency and minimal manual intervention.
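Before the full script, here is a minimal sketch of the listing step on its own. It assumes a hypothetical geodatabase path and shows one way to collect root-level feature classes plus the feature classes inside each feature dataset; the feature_dataset argument of ListFeatureClasses avoids switching the workspace for every dataset, which the full script below does instead.

python
import arcpy

# Hypothetical geodatabase path, for illustration only
gdb = r"C:\Data\Sample.gdb"
arcpy.env.workspace = gdb

# Feature classes sitting at the root of the GDB
root_fcs = arcpy.ListFeatureClasses() or []

# Feature classes grouped by the feature dataset that contains them
dataset_fcs = {}
for ds in arcpy.ListDatasets(feature_type="Feature") or []:
    dataset_fcs[ds] = arcpy.ListFeatureClasses(feature_dataset=ds) or []

print("Root-level:", root_fcs)
print("Inside datasets:", dataset_fcs)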


🧑‍💻 Python Script:

python
import arcpy
import os

# Path to the geodatabase containing the final feature class models (with correct fields)
final_gdb = r"C:\Work\Projects\Zatca\UG_Data\5Th\Merge\Batha_Total.gdb"

# Folder containing the GDB files to append from
folder_path = r"C:\Work\Projects\Zatca\UG_Data\5Th\DataBaseBathaNew"

# Get a list of all datasets and feature classes in the final GDB (these are the models)
arcpy.env.workspace = final_gdb
final_datasets = arcpy.ListDatasets() or []
final_feature_classes = arcpy.ListFeatureClasses() or []

# Iterate through each GDB in the folder containing the source feature classes
for gdb_name in os.listdir(folder_path):
    gdb_path = os.path.join(folder_path, gdb_name)

    # Check if the item is a valid File Geodatabase
    if os.path.isdir(gdb_path) and gdb_name.endswith('.gdb'):
        arcpy.env.workspace = gdb_path

        # List all datasets and feature classes in the current source GDB
        source_datasets = arcpy.ListDatasets() or []
        source_feature_classes = arcpy.ListFeatureClasses() or []

        # Process feature classes that live inside feature datasets
        for final_dataset in final_datasets:
            # Check if the source GDB has a matching dataset
            if final_dataset in source_datasets:
                final_dataset_path = os.path.join(final_gdb, final_dataset)
                source_dataset_path = os.path.join(gdb_path, final_dataset)

                # Point the workspace at each dataset to list the feature classes inside it
                arcpy.env.workspace = source_dataset_path
                source_fc_list = arcpy.ListFeatureClasses() or []

                arcpy.env.workspace = final_dataset_path
                final_fc_list = arcpy.ListFeatureClasses() or []

                # Append every feature class that exists in both dataset copies
                for final_fc in final_fc_list:
                    if final_fc in source_fc_list:
                        source_fc_path = os.path.join(source_dataset_path, final_fc)
                        final_fc_path = os.path.join(final_dataset_path, final_fc)

                        # Append data using the 'NO_TEST' schema option
                        arcpy.Append_management([source_fc_path], final_fc_path, "NO_TEST")
                        print(f"Appended data from '{source_fc_path}' into '{final_fc_path}'.")

        # Process individual feature classes at the root level (not inside datasets)
        arcpy.env.workspace = gdb_path  # Reset to the root of the source GDB
        for final_fc in final_feature_classes:
            if final_fc in source_feature_classes:
                source_fc_path = os.path.join(gdb_path, final_fc)
                final_fc_path = os.path.join(final_gdb, final_fc)

                # Append data using the 'NO_TEST' schema option
                arcpy.Append_management([source_fc_path], final_fc_path, "NO_TEST")
                print(f"Appended data from '{source_fc_path}' into '{final_fc_path}'.")

🔍 Key Features of the Script:

  • Consistent data loading: Data from multiple source GDBs is appended into the unified final data model, with each feature class matched to its target by name so records land in the right place.

  • Flexible workspace management: The script intelligently handles datasets and feature classes both within datasets and at the root level, making it adaptable to various geodatabase structures.

  • ArcPy-powered appending: ArcPy's Append_management (arcpy.management.Append) loads records into the existing target feature classes with the NO_TEST schema option, so the target schema is never modified; because NO_TEST skips field matching, the source and target schemas should already correspond.
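Stripped of the looping and path handling, the core operation is a single Append call. The snippet below is a minimal sketch with hypothetical paths, showing the NO_TEST schema option in isolation.

python
import arcpy

# Hypothetical source and target paths, for illustration only
source_fc = r"C:\Data\Source.gdb\Roads"
target_fc = r"C:\Data\Final.gdb\Roads"

# NO_TEST skips the schema check: unmatched source fields are dropped
# and unmatched target fields are left null, so the target schema stays intact.
arcpy.management.Append([source_fc], target_fc, "NO_TEST")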


✅ Conclusion:

This script is useful when you have multiple geodatabases (GDBs) contributing data into a final schema and you want to automate the data integration process. Whether the data is inside datasets or at the root level of the GDB, this solution ensures that the final GDB remains updated and consistent.
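After a batch run, a quick row-count check on the final GDB can confirm that data actually arrived. The snippet below is a small sketch, not part of the original script, and it only reports root-level feature classes.

python
import arcpy

final_gdb = r"C:\Work\Projects\Zatca\UG_Data\5Th\Merge\Batha_Total.gdb"
arcpy.env.workspace = final_gdb

# Print the row count of every root-level feature class in the final GDB
for fc in arcpy.ListFeatureClasses() or []:
    count = int(arcpy.management.GetCount(fc).getOutput(0))
    print(f"{fc}: {count} rows")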
