[GIS] How to update a Geoprocessing Service when its Python script changes

arcgis-server geoprocessing-service

I want to use a Python script to perform some analysis, and publish this via an ArcGIS Server Geoprocessing Service.

A quick tour of publishing a geoprocessing service says to run the Python script in ArcMap to get a Geoprocessing Result – it's actually this result which you publish as a GP service.

Does this mean that every time you make any changes to the Python script, you need to:

  • re-run the Python script (in ArcMap) in order to obtain a new Result
  • publish the Result to ArcGIS Server (over-writing the last GP service)?

I'm hoping there's a faster way to update the script as this will very quickly become tedious during testing…

Best Answer

No, you don't need to re-run the script tool and republish the result. You only need to do that if you change the tool parameters (adding, removing, or changing a data type). This is because if you look inside C:\arcgisserver\directories\arcgissystem\arcgisinput\REF01\%Gpservicename%.GPServer\extracted\v101, you will find a toolbox which contains your script as well as the result. You cannot modify the published toolbox; even though it appears editable, your changes will not be saved. If you expect to change the tool often, consider using a Python script to automate publishing the GP result; there are many samples of this available.
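The re-run/republish cycle can itself be scripted with arcpy's service-definition functions. A minimal sketch follows; it only runs on a machine with ArcGIS installed, and the toolbox path, tool name, service name, and connection file are all placeholders you would replace with your own:

```python
# Automates republishing a GP result: run the script tool, draft a
# service definition from the result, stage it, and upload it.

def publish_gp_result(toolbox, tool_name, params, service_name,
                      connection, sddraft, sd):
    """Run the script tool, then draft, stage, and upload the GP service."""
    import arcpy  # deferred so the function can be defined without ArcGIS

    arcpy.ImportToolbox(toolbox)
    result = getattr(arcpy, tool_name)(*params)  # run the script tool

    # Draft a service definition from the result. CreateGPSDDraft
    # returns a dictionary of analyzer errors/warnings/messages.
    analysis = arcpy.CreateGPSDDraft(
        result, sddraft, service_name,
        server_type="ARCGIS_SERVER",
        connection_file_path=connection,
        copy_data_to_server=False,
        summary=service_name, tags="gp")

    if analysis["errors"]:
        raise RuntimeError(analysis["errors"])

    arcpy.StageService_server(sddraft, sd)                # compile the .sd
    arcpy.UploadServiceDefinition_server(sd, connection)  # publish/overwrite
```

A call might look like `publish_gp_result(r"C:\tools\MyTools.tbx", "MyScriptTool_mytools", ["value1", "value2"], "MyGPService", r"C:\connections\server.ags", r"C:\temp\svc.sddraft", r"C:\temp\svc.sd")`, where the tool alias and paths are again hypothetical.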

There is nothing stopping you from going into that folder and editing the published Python script directly (just copy/paste the code). It will work in most cases, except where some of your variables were replaced by internal Esri variables during publishing. Please don't do that: it is very easy to end up with your source and published scripts out of sync, and it gets messy quickly.

The best practice I have arrived at over the last two years of working on GP services is to separate the code files from the tool itself. Let me explain below.

Create a Python file (I refer to this as the Caller file).

import sys
import socket  # or just hardcode the machine name and the path; use UNC for a shared folder
import arcpy

# Make the shared code folder importable before importing the code files
sys.path.append(r"\\" + socket.gethostname() + "%path to the actual code files")
import codefile1
import codefile2

Param1 = arcpy.GetParameterAsText(0)
Param2 = arcpy.GetParameterAsText(1)

def mainworkflow(Param1, Param2):
    """General function-caller for the code files"""
    Result = codefile1.functionName(Param1, Param2)
    return Result

if Param1 == "" and Param2 == "":  # to have empty default values after publishing
    # for GP script tool publishing purposes only
    Result = ""
else:
    Result = mainworkflow(Param1, Param2)

Make a tool from this Python file, specifying the parameters in its dialog box. This is what gets published as a GP service, and you can then create and work on new Python files which contain only the code that actually does the job. Whenever you realize you need to split your code into multiple files, it is just a matter of importing the new file from the Caller and calling its functions.
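For completeness, a worker module to pair with the Caller above could be as simple as this sketch (the module name codefile1 and function name functionName come from the Caller; the string-joining body is purely illustrative, since a real module would call geoprocessing tools):

```python
# codefile1.py -- hypothetical worker module imported by the Caller.
# All business logic lives here; the Caller only forwards parameters.

def functionName(Param1, Param2):
    """Illustrative workhorse: combine the two tool parameters.

    In a real GP service this would run arcpy tools; plain string
    handling is used here so the sketch stays self-contained.
    """
    return "{} | {}".format(Param1, Param2)
```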

After making changes to the code, feel free to run the GP service directly: the Caller will import the codefile1 module from the folder you specified and execute the code. No restart of the GP service is required, and nothing needs to be republished. As simple as that. I maintain many GP services this way (roughly 3,000 lines of code across ~20 Python modules), and the approach has proven very efficient.
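One caveat if you test the Caller in a long-running Python session (for example, the ArcMap Python window) rather than through the GP service: Python caches modules after the first import, so edits to a code file may not show up until you force a re-read. In Python 2 this is the builtin reload(); in Python 3 it is importlib.reload(). A self-contained demonstration, using a throwaway module written to a temp folder:

```python
# Shows the module cache masking an edit, and reload picking it up.
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True      # skip .pyc files for a clean demo
tmpdir = tempfile.mkdtemp()
sys.path.insert(0, tmpdir)
module_path = os.path.join(tmpdir, "codefile_demo.py")

with open(module_path, "w") as f:
    f.write("VERSION = 1\n")

import codefile_demo
first = codefile_demo.VERSION       # cached from the first import

with open(module_path, "w") as f:   # simulate editing the code file
    f.write("VERSION = 2\n")

importlib.reload(codefile_demo)     # force a re-read from disk
second = codefile_demo.VERSION
```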

To be able to access the Python files (the modules you import), make sure the folder where they are stored is accessible to the ArcGIS Server account. The GP service runs under this account, and it needs access to the service's resources.
