<?xml version="1.0" encoding="UTF-8"?><metadata>
<Esri>
<CreaDate>20191107</CreaDate>
<CreaTime>11350800</CreaTime>
<ArcGISFormat>1.0</ArcGISFormat>
<SyncOnce>TRUE</SyncOnce>
<ModDate>20220915</ModDate>
<ModTime>110335</ModTime>
</Esri>
<dataIdInfo>
<idCitation>
<resTitle>DetectObjectsUsingDeepLearning</resTitle>
<date>
<createDate>20191107</createDate>
</date>
</idCitation>
<idAbs>
<para>In a raster analysis deployment, this tool runs a trained deep learning model on an input raster to produce a feature class containing the objects it identifies. The feature class can be shared as a hosted feature layer in your portal. The features can be bounding boxes or polygons around the objects found, or points at the centers of the objects.</para>
</idAbs>
<descKeys KeyTypCd="005">
<keyTyp>
<keyTyp>005</keyTyp>
</keyTyp>
<keyword>CNTK</keyword>
<keyword>deep learning model</keyword>
<keyword>detect objects</keyword>
<keyword>raster analysis</keyword>
<keyword>TensorFlow</keyword>
<keyword>training</keyword>
</descKeys>
</dataIdInfo>
<distInfo>
<distributor>
<distorFormat>
<formatName>ArcToolbox Tool</formatName>
</distorFormat>
</distributor>
</distInfo>
<mdDateSt>20191108</mdDateSt>
<mdContact>
<rpOrgName>Environmental Systems Research Institute, Inc. (Esri)</rpOrgName>
<rpCntInfo>
<cntAddress>
<delPoint>380 New York Street</delPoint>
<city>Redlands</city>
<adminArea>California</adminArea>
<postCode>92373-8100</postCode>
<eMailAdd>info@esri.com</eMailAdd>
<country>United States</country>
</cntAddress>
<cntPhone>
<voiceNum>909-793-2853</voiceNum>
<faxNum>909-793-5953</faxNum>
</cntPhone>
<cntOnlineRes>
<linkage>http://www.esri.com</linkage>
</cntOnlineRes>
</rpCntInfo>
<role>
<RoleCd>007</RoleCd>
</role>
</mdContact>
<tool displayname="DetectObjectsUsingDeepLearning" name="DetectObjectsUsingDeepLearning" softwarerestriction="none" toolboxalias="rasteranalytics">
<summary>
<para>In a raster analysis deployment, this tool runs a trained deep learning model on an input raster to produce a feature class containing the objects it identifies. The feature class can be shared as a hosted feature layer in your portal. The features can be bounding boxes or polygons around the objects found, or points at the centers of the objects.</para>
</summary>
<alink_name>DetectObjectsUsingDeepLearning_ra</alink_name>
<toolIllust alt="Detect Objects Using Deep Learning tool illustration" src="withheld" type="dialog"/>
<toolIllust alt="Detect Objects Using Deep Learning tool illustration" src="withheld" type="illustration"/>
<parameters>
<param datatype="String" direction="Input" displayname="inputRaster" expression="inputRaster" name="inputRaster" sync="true" type="Required">
<pythonReference>
<para>The input image used to detect objects. It can be an image service URL, a raster layer, an image service, a map server layer, or an internet tiled layer.</para>
</pythonReference>
<dialogReference>
<para>The input image used to detect objects. It can be an image service URL, a raster layer, an image service, a map server layer, or an internet tiled layer.</para>
</dialogReference>
</param>
<param datatype="String" direction="Input" displayname="outputObjects" expression="outputObjects" name="outputObjects" type="Required"/>
<param datatype="String" direction="Input" displayname="model" expression="model" name="model" type="Required"/>
<param datatype="String" direction="Input" displayname="modelArguments" expression="{modelArguments}" name="modelArguments" sync="true" type="Optional">
<pythonReference>
<para>The model arguments are defined by the Python raster function class referenced by the input model. Use this parameter to list additional deep learning parameters and arguments for experimentation and refinement, such as a confidence threshold for fine-tuning the sensitivity. The tool populates the argument names by reading the Python module on the raster analysis server.</para>
</pythonReference>
<dialogReference>
<para>The model arguments are defined by the Python raster function class referenced by the input model. Use this parameter to list additional deep learning parameters and arguments for experimentation and refinement, such as a confidence threshold for fine-tuning the sensitivity. The tool populates the argument names by reading the Python module on the raster analysis server.</para>
</dialogReference>
</param>
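The model arguments are supplied as a single semicolon-delimited string of name/value pairs, matching the format used in the script examples for this tool (for example, "score_threshold 0.6;padding 0"). A minimal helper for building that string from a dictionary might look like the following sketch; the argument names shown are illustrative, since the actual names come from the Python raster function of the input model.

```python
def build_model_arguments(args):
    # Join name/value pairs into the "name value;name value" string
    # format expected by the modelArguments parameter.
    return ";".join(f"{name} {value}" for name, value in args.items())

# Hypothetical argument names; the real names are read from the model.
model_args = build_model_arguments({"score_threshold": 0.6, "padding": 0})
# model_args == "score_threshold 0.6;padding 0"
```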
<param datatype="Boolean" direction="Input" displayname="runNMS" expression="{runNMS}" name="runNMS" sync="true" type="Optional">
<pythonReference>
<para>Specifies whether non-maximum suppression will be performed, in which duplicate detected objects are identified and the duplicate feature with the lower confidence value is removed.</para>
<bulletList>
<bullet_item>NO_NMS—All detected objects will be in the output feature class. This is the default.</bullet_item>
<bullet_item>NMS—Duplicate detected objects will be removed.</bullet_item>
</bulletList>
</pythonReference>
<dialogReference>
<para>
Specifies whether non-maximum suppression will be performed, in which duplicate detected objects are identified and the duplicate feature with the lower confidence value is removed.
<bulletList>
<bullet_item>Unchecked—All detected objects will be in the output feature class. This is the default.</bullet_item>
<bullet_item>Checked—Duplicate detected objects will be removed.</bullet_item>
</bulletList>
</para>
</dialogReference>
</param>
<param datatype="String" direction="Input" displayname="confidenceScoreField" expression="{confidenceScoreField}" name="confidenceScoreField" sync="true" type="Optional">
<pythonReference>
<para>The field in the feature service that contains the confidence scores output by the object detection method.</para>
<para>This parameter is required when the NMS keyword is used for the runNMS parameter.</para>
</pythonReference>
<dialogReference>
<para>The field in the feature service that contains the confidence scores output by the object detection method.</para>
<para>This parameter is required when the Non Maximum Suppression parameter is checked.</para>
</dialogReference>
</param>
<param datatype="String" direction="Input" displayname="classValueField" expression="{classValueField}" name="classValueField" sync="true" type="Optional">
<pythonReference>
<para>The name of the class value field in the feature service. </para>
<para>If a field name is not specified, the tool will attempt to use a Classvalue or Value field. If these fields do not exist, the tool will identify all records as belonging to one class.</para>
</pythonReference>
<dialogReference>
<para>The name of the class value field in the feature service. </para>
<para>If a field name is not specified, the tool will attempt to use a Classvalue or Value field. If these fields do not exist, the tool will identify all records as belonging to one class.</para>
</dialogReference>
</param>
<param datatype="Double" direction="Input" displayname="maxOverlapRatio" expression="{maxOverlapRatio}" name="maxOverlapRatio" sync="true" type="Optional">
<pythonReference>
<para>The maximum overlap ratio for two overlapping features, defined as the ratio of intersection area to union area. The default is 0.</para>
</pythonReference>
<dialogReference>
<para>The maximum overlap ratio for two overlapping features, defined as the ratio of intersection area to union area. The default is 0.</para>
</dialogReference>
</param>
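The runNMS, confidenceScoreField, and maxOverlapRatio parameters work together: among overlapping detections, the feature with the higher confidence score is kept, and any duplicate whose intersection-over-union with a kept feature exceeds the maximum overlap ratio is removed. The following is a minimal conceptual sketch of that procedure for axis-aligned boxes; it is for illustration only, as the tool performs non-maximum suppression on the raster analysis server.

```python
def iou(a, b):
    # Intersection-over-union for boxes given as (xmin, ymin, xmax, ymax).
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def nms(boxes, scores, max_overlap_ratio=0.0):
    # Visit detections in order of descending confidence; keep a box only
    # if its overlap with every already-kept box is within the threshold.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    kept = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= max_overlap_ratio for j in kept):
            kept.append(i)
    return kept
```

With the default maximum overlap ratio of 0, any positive overlap between two detections causes the lower-confidence one to be dropped.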
<param datatype="Boolean" direction="Input" displayname="processAllRasterItems" expression="{processAllRasterItems}" name="processAllRasterItems" type="Optional"/>
<param datatype="String" direction="Input" displayname="context" expression="{context}" name="context" type="Optional"/>
<param datatype="String" direction="Input" displayname="outputClassifiedRaster" expression="{outputClassifiedRaster}" name="outputClassifiedRaster" type="Optional"/>
</parameters>
<returnvalues/>
<environments>
<environment label="Cell size" name="cellSize"/>
<environment label="Output extent" name="extent"/>
<environment label="Output coordinate system" name="outputCoordinateSystem"/>
<environment label="ParallelProcessingFactor" name="parallelProcessingFactor"/>
<environment label="processor type" name="processorType"/>
</environments>
<usage>
<bullet_item>
<para>Your raster analysis (RA) server Python environment must be configured with the Python API of a supported deep learning framework, such as TensorFlow or CNTK.</para>
</bullet_item>
<bullet_item>
<para>When the tool runs, your RA server calls the third-party deep learning Python API (such as TensorFlow or CNTK) and uses the specified Python raster function to process each raster tile.</para>
</bullet_item>
<bullet_item>
<para>The Input Model parameter will only use a deep learning package (.dlpk) item from the portal.</para>
</bullet_item>
<bullet_item>
<para>After the Input Model is selected or specified, the tool will obtain the model arguments information from your raster analysis server. The tool may fail to obtain such information if your input model is invalid or your raster analysis server isn’t properly configured with the deep learning framework.</para>
</bullet_item>
<bullet_item>
<para>Use the Non Maximum Suppression parameter to identify and remove duplicate features from the object detection.</para>
</bullet_item>
<bullet_item>
<para>For more information about deep learning, see Deep learning in ArcGIS Pro.</para>
</bullet_item>
</usage>
<scriptExamples>
<scriptExample>
<title>DetectObjectsUsingDeepLearning_ra example 1 (Python window)</title>
<para>This example creates a hosted feature layer in your portal based on object detection using the DetectObjectsUsingDeepLearning tool.</para>
<code xml:space="preserve">import arcpy
arcpy.DetectObjectsUsingDeepLearning_ra(
"https://myserver/rest/services/Farm/ImageServer",
"https://myportal/sharing/rest/content/items/itemId", "detectedTrees",
"score_threshold 0.6;padding 0", "NO_NMS")
</code>
</scriptExample>
<scriptExample>
<title>DetectObjectsUsingDeepLearning example 2 (stand-alone script)</title>
<para>This example creates a hosted feature layer in your portal based on object detection using the DetectObjectsUsingDeepLearning tool.</para>
<code xml:space="preserve">#---------------------------------------------------------------------------
# Name: DetectObjectsUsingDeepLearning_example02.py
# Requirements: ArcGIS Image Server
#---------------------------------------------------------------------------

# Import system modules
import arcpy

# Set local variables
inImage = "https://myserver/rest/services/coconutFarmImage/ImageServer"
inModel = "https://myportal/sharing/rest/content/items/itemId"
outName = "detectedTrees"
modelArgs = "score_threshold 0.6;padding 0"
runNMS = "NMS"
confScoreField = "Confidence"
classVField = "Class"
maxOverlapRatio = 0.15

# Execute the Detect Objects Using Deep Learning raster analysis tool
arcpy.DetectObjectsUsingDeepLearning_ra(inImage, inModel, outName, modelArgs,
                                        runNMS, confScoreField, classVField,
                                        maxOverlapRatio)
</code>
</scriptExample>
</scriptExamples>
<shortdesc>ArcGIS geoprocessing tool that runs a trained deep learning model on an input raster to produce a feature class containing the objects it finds.</shortdesc>
<arcToolboxHelpPath>withheld</arcToolboxHelpPath>
</tool>
</metadata>
