Using PowerShell to automatically find and package log files for ArcGIS Server, Portal for ArcGIS, and ArcGIS Data Store.


Have you ever submitted a support case and, as a part of the troubleshooting process, been asked to provide logs for one (or multiple) components of a base ArcGIS Enterprise deployment?

Esri provides documentation on where the default logging location is for various ArcGIS Enterprise components, but this is a user-configurable path.

Default Logging Locations

  • ArcGIS Server
    • C:\arcgisserver\logs – per the Esri documentation.
  • Portal for ArcGIS
    • C:\arcgisportal\logs – per the Esri documentation.
  • ArcGIS Data Store
    • C:\arcgisdatastore\logs – per the Esri documentation.

Maybe you go to check these default directories and find that the directory doesn’t exist, or that there aren’t any logs present? You ask yourself, “Am I working with an ArcGIS Enterprise environment that has had its logging locations modified from the defaults? What’s the process for figuring out where the logs might be?”

Depending on your enterprise configuration, as well as the enterprise component in question, there could be a number of places where the log files are being stored.

If you wanted to manually find the location for each component, you could:

ArcGIS Server

Go to ArcGIS Server Manager > Logs > Settings, and identify the value specified for “Log file path”.
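The same value can also be read programmatically via the ArcGIS Server Administrator API’s logs/settings resource. Below is a minimal sketch; the host name, port, and credentials are placeholders you would replace with your own, and the exact property names in the response may vary between versions.

```powershell
# Hypothetical sketch: query the ArcGIS Server Admin API for the configured log path.
# 6443 is the default ArcGIS Server port; adjust the host and credentials as needed.
$adminUrl = "https://myserver.domain.com:6443/arcgis/admin"

# Request an admin token from the generateToken endpoint
$tokenParams = @{
    username = "siteadmin"      # placeholder
    password = "MyPassword"     # placeholder
    client   = "requestip"
    f        = "json"
}
$token = (Invoke-RestMethod -Uri "$adminUrl/generateToken" -Method Post -Body $tokenParams).token

# Read the log settings and print the configured log directory
$settings = Invoke-RestMethod -Uri "$adminUrl/logs/settings?f=json&token=$token"
$settings.settings.logDir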

Portal for ArcGIS

Go to Portal Administrator Directory > Logs > Settings, and identify the value specified for “Log Location”.
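Portal exposes an equivalent logs/settings resource through its Portal Admin API. A hypothetical sketch, with placeholder host name and credentials (7443 is the default portal port, and the exact response property names may vary between versions):

```powershell
# Hypothetical sketch: read the Portal for ArcGIS log settings via the Portal Admin API.
$portalUrl = "https://myportal.domain.com:7443/arcgis"

# Request a token from the portal's sharing API
$tokenParams = @{
    username = "portaladmin"                        # placeholder
    password = "MyPassword"                         # placeholder
    referer  = "https://myportal.domain.com:7443"
    f        = "json"
}
$token = (Invoke-RestMethod -Uri "$portalUrl/sharing/rest/generateToken" -Method Post -Body $tokenParams).token

# Read the log settings and print the configured log directory
$settings = Invoke-RestMethod -Uri "$portalUrl/portaladmin/logs/settings?f=json&token=$token"
$settings.logDir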

ArcGIS Data Store

  • Open a terminal (cmd, PowerShell, Windows Terminal) as administrator
  • Go to the “C:\Program Files\ArcGIS\DataStore\tools” directory*.
  • Run the “describedatastore” command and identify the value specified for “Log location”.

*Assuming you are using the default installation directory for ArcGIS Data Store.
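Those steps can be collapsed into a couple of lines of PowerShell. This sketch assumes the default installation directory and simply filters the command output for the relevant line:

```powershell
# Hypothetical sketch: run describedatastore and pull out the "Log location" line.
# Run from an elevated PowerShell prompt; assumes the default install directory.
$tools  = "C:\Program Files\ArcGIS\DataStore\tools"
$output = & "$tools\describedatastore.bat"

# Filter the output for the line reporting the log location
$output | Where-Object { $_ -match "Log location" }
```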

The PowerShell alternative

Sure, checking those things manually works if you have a relatively simple environment that doesn’t regularly change. But what if you are in an organisation with multiple ArcGIS Enterprise deployments across different environments (e.g. test, dev, UAT, production)?

What if there was an easier way to automatically obtain the latest log file for ArcGIS Server, Portal for ArcGIS, or ArcGIS Data Store, and add it to a ZIP file ready to share with whoever has asked for it?

Introducing, the suitably named, “ExtractLogFiles.ps1”!

Wait, what is “ExtractLogFiles.ps1”?

Written in PowerShell, the script is designed to run on a Windows machine hosting one or more base ArcGIS Enterprise components, automatically locate the most recent log file for each specified component, and add those files to a ZIP file in the current (or a user-specified) directory.

How does it work?

At a high-level, the script:

  • Determines the output folder for the ZIP file (whether default or user-specified)
  • For each component specified, finds the most recent log file by:
    • Determining the install location for the component using environment variables
    • Reading the settings file that specifies the log file directory for the component
    • Accessing the log file directory and selecting the most recently written file
  • Copies the log file for each specified component into the output folder
  • Once all log files have been copied, adds them to a ZIP file and deletes the loose copies
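The steps above can be sketched in a few lines of PowerShell, using ArcGIS Server as the example component. The environment variable name is the one the ArcGIS Server installer sets, but the settings-file path and property name below are assumptions for illustration; the actual script handles each component’s real settings format.

```powershell
# A minimal sketch of the steps above for a single component (ArcGIS Server).
$destination = Get-Location   # 1. default output folder

# 2. Determine the install location from an environment variable
#    (AGSSERVER is set by the ArcGIS Server installer)
$installDir = [Environment]::GetEnvironmentVariable("AGSSERVER", "Machine")

# 3. Read a settings file recording the configured log directory
#    (this file name and JSON property are hypothetical placeholders)
$settingsFile = Join-Path $installDir "framework\etc\log-settings.json"
$logDir = (Get-Content $settingsFile -Raw | ConvertFrom-Json).logDir

# 4. Select the most recently written log file
$latest = Get-ChildItem -Path $logDir -Filter *.log -Recurse |
          Sort-Object LastWriteTime -Descending |
          Select-Object -First 1

# 5. Copy it to the output folder, add it to a ZIP, then remove the loose copy
$copy = Copy-Item $latest.FullName -Destination $destination -PassThru
Compress-Archive -Path $copy.FullName -DestinationPath (Join-Path $destination "logs.zip") -Force
Remove-Item $copy.FullName
```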

How can I run it?

Download the script file to your local machine.

You can also read through the script on GitHub.

There are a number of parameters that are available to specify when running the script, such as:

.\ExtractLogFiles.ps1 -help -server -portal -datastore -datastorefolder "server" -destination "C:\Temp"

Running the script with “-help” provides details about each parameter and its usage.

For example (#1):

.\ExtractLogFiles.ps1 -server -portal

Will copy the most recent log file for ArcGIS Server and Portal for ArcGIS and add both to an output ZIP file within the current directory.

For example (#2):

.\ExtractLogFiles.ps1 -server -portal -datastore -destination 'C:\Temp'

Will copy the most recent log file for ArcGIS Server, Portal for ArcGIS, and ArcGIS Data Store (from the folder specified during script execution) and add them to an output ZIP file within the ‘C:\Temp’ directory.

Because the “server” directory was specified for ArcGIS Data Store, the log file will also have “server” as its name. Something to look out for.


Potential issues you might run into

At this point I would like to stress that this script was written in my free time and does not constitute ‘official’, supported software in any way. Running it on any machine, in any environment, carries its own risk. It has been tested on Windows 10 machines running either ArcGIS Enterprise 10.9 or 10.8.1, but your mileage may vary.

The most common issue I anticipate will occur when initially trying to run the script, due to the ExecutionPolicy setting in your organisation.

If the ExecutionPolicy is set to “Restricted”, you will receive an error message when attempting to run the script.

The following documentation provides an overview of ExecutionPolicy in relation to PowerShell and steps on how to adjust it. Please consider your organisation’s IT policies and standards before changing a setting such as this.
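For reference, these are the standard cmdlets for inspecting and (temporarily) relaxing the policy. Scoping the change to the current process avoids altering the machine-wide setting:

```powershell
# Check the effective execution policy at each scope
Get-ExecutionPolicy -List

# Allow scripts for the current session only, without changing machine-wide policy
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process

# Alternatively, launch a one-off bypass for just this script
powershell.exe -ExecutionPolicy Bypass -File .\ExtractLogFiles.ps1 -server
```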

In addition, depending on error messages that appear during execution, some other things to potentially check are:

  • Whether the account running the script has appropriate permissions on the directory containing the log files
  • Whether the account running the script has appropriate permissions on the output directory where the ZIP file will be created
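A quick way to verify both permission points above before running the script is to probe each directory directly. The paths below are examples; substitute your own log and output directories:

```powershell
# Quick permission probes for the two checks listed above (paths are examples)
$logDir = "C:\arcgisserver\logs"
$outDir = "C:\Temp"

# Can the current account read the log directory?
try   { Get-ChildItem $logDir -ErrorAction Stop | Out-Null; "Read OK: $logDir" }
catch { "Cannot read: $logDir" }

# Can the current account write to the output directory?
try {
    $probe = Join-Path $outDir "permcheck.tmp"
    New-Item $probe -ItemType File -ErrorAction Stop | Out-Null
    Remove-Item $probe
    "Write OK: $outDir"
}
catch { "Cannot write: $outDir" }
```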

Okay I have the logs, now what?

If you wanted to take a look at the logs yourself to understand what might be occurring within your environment, the following documentation pages may be helpful in building some initial context around logs in ArcGIS Enterprise:

Thanks for reading!

If you made it to the end, thank you for reading. I hope the above helps you when searching for log file directories or the log files themselves, or that it has made you aware of the power of PowerShell!

Stay tuned for additional articles around how we can combine PowerShell scripting and ArcGIS Enterprise logging!
