
Monday, December 22, 2014

Converting PDF to TIFF (10.3) using ArcPy

New at 10.3 is a handy tool many of us in the GIS world have wanted for a long time: converting georeferenced PDFs to TIFF files.
The help describes this tool as:
Exports an existing PDF file to a Tagged Image File Format (TIFF). If the PDF has georeference information, the TIFF can be a GeoTIFF. These TIFFs can be used as a source for heads-up digitizing and viewing in ArcMap. Both GeoPDF and ISO standards of georeferenced PDFs are supported. 

It is very straightforward to use:

import arcpy
arcpy.PDFToTIFF_conversion(in_pdf_file="C:/temp/sample.pdf", 
                           out_tiff_file="C:/temp/sample.tif", 
                           pdf_password="", 
                           pdf_page_number="1", 
                           pdf_map="Layers", 
                           clip_option="NO_CLIP", 
                           resolution="250", 
                           color_mode="RGB_TRUE_COLOR", 
                           tiff_compression="LZW", 
                           geotiff_tags="GEOTIFF_TAGS")



For this sample, I just created a map with one of the AGOL imagery layers in a blank ArcMap layout and exported it to PDF with the georeference information embedded.  The result is a TIFF that can be used as a source for additional spatial data in an ArcMap session.
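If you want to script the export side as well, arcpy.mapping can write a PDF with the georeference information embedded; a minimal sketch, assuming a hypothetical sample.mxd:

import arcpy

# Export the layout to PDF, embedding georeference information so the
# PDF To TIFF tool can later produce a GeoTIFF from it.
mxd = arcpy.mapping.MapDocument("C:/temp/sample.mxd")
arcpy.mapping.ExportToPDF(mxd, "C:/temp/sample.pdf", georef_info=True)
del mxd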

Enjoy,

A

Wednesday, May 29, 2013

Understanding the Density Tool Outputs

Straight from the Esri blog on analysis comes a great article talking about the density tool output.

It's worth a read, and it can be found here.

Enjoy

Thursday, March 28, 2013

Support GUI Design in ArcGIS for Desktop

Please support this idea of having a GUI designer built into Python add-ins.

It can be found here: http://ideas.arcgis.com/ideaView?id=087E00000004SmHIAU

Thanks everyone.

Monday, February 4, 2013

Convert Table or Feature class to CSV (10.1)

Sometimes you need to export data from a feature class or table to a CSV file.  CSV stands for comma-separated values.  Wikipedia defines a CSV file as follows:

A comma-separated values (CSV) file stores tabular data (numbers and text) in plain-text form. Plain text means that the file is a sequence of characters, with no data that has to be interpreted as binary numbers. A CSV file consists of any number of records, separated by line breaks of some kind; each record consists of fields, separated by some other character or string, most commonly a literal comma or tab. Usually, all records have an identical sequence of fields.

To create a CSV file at 10.1, you need to strip out certain field types (Geometry, Blob, and Raster) because the file format only supports plain text.  This can be done with the following:

 fieldnames = [f.name for f in desc.fields if f.type not in ["Geometry", "Raster", "Blob"]]

Next you need to write the field names and rows to a file.  This is extremely easy with the CSV library in python.  Documentation about this standard library can be found here.


import csv
import arcpy
from arcpy import da

def get_rows(data_set, fields):
   with da.SearchCursor(data_set, fields) as cursor:
      for row in cursor:
         yield row

if __name__ == "__main__":
   data_set = arcpy.GetParameterAsText(0) # feature class/table
   output = arcpy.GetParameterAsText(1) # csv file
   desc = arcpy.Describe(data_set)
   fieldnames = [f.name for f in desc.fields if f.type not in ["Geometry", "Raster", "Blob"]]
   rows = get_rows(data_set, fieldnames)
   with open(output, 'wb') as out_file:
      out_writer = csv.writer(out_file)
      out_writer.writerow(fieldnames)
      for row in rows:
         out_writer.writerow(row)

Here we have a function called get_rows(), which takes two parameters.  The first is data_set, which can be a table or feature class.  The second is fields.  At 10.1, you must define your fields, unlike the 10.0 cursor objects.  The function uses yield, which makes it a generator: roughly speaking, the body only runs as the caller iterates over it, one row at a time (a better explanation can be found here).  Using the CSV module in Python, we can then easily write out each row from the rows generator object.
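As a quick illustration of that lazy behavior, here is a minimal generator sketch unrelated to arcpy:

def countdown(n):
    # nothing here runs until the generator is iterated
    while n > 0:
        yield n
        n -= 1

for value in countdown(3):
    print value # prints 3, then 2, then 1, one value per loop pass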

FYI, this is written so you can put this code into a script tool and toolbox for ArcGIS 10.1.

Enjoy

Friday, July 20, 2012

The Tile Package

What is it?
New at 10.1, tile packages are compressed files that contain a map document's cached data.  The tile package, or .tpk, is ideal for disconnected use and for sharing information to ArcGIS Online.
Why Should I Care?
It makes sharing caches easy, and you can create custom caches on the fly.  From a geoprocessing/application point of view, this means you can get your data out to mobile users without needing an air card.  It also provides an easy way to share a cache from server to server or to AGOL.

Other reasons include:
  • Improved rendering performance
  • Improved quality
  • Follows industry standards (AGOL, Google, Bing, etc.)

You can create a package using geoprocessing as follows:
import os
import arcpy
from arcpy import env
# Set environment settings
env.overwriteOutput = True
env.workspace = "C:/Tilepackages/"
# Loop through the workspace, find all the mxds and create a tile package using the same name as the mxd
for mxd in arcpy.ListFiles("*.mxd"):
    print "Packaging " + mxd
    arcpy.CreateMapTilePackage_management(mxd, "ONLINE",
                                          os.path.splitext(mxd)[0] + '.tpk',
                                          "PNG8", "10")
Pretty easy to create.  It should be noted that your ArcMap document's extent determines the area to be processed.  You can embed web service layers as well as local data.  The cache is fused together, which means you cannot query the base data.
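Once the .tpk exists, getting it to AGOL can be scripted too; a minimal sketch using the Share Package geoprocessing tool (the credentials, file path, summary, and sharing arguments here are placeholders):

import arcpy

# Upload a tile package to ArcGIS Online under the supplied account,
# keeping it private to the owner's content.
arcpy.SharePackage_management("C:/Tilepackages/sample.tpk",
                              "username", "password",
                              "Sample imagery cache",  # summary
                              "tiles, imagery",        # tags
                              "",                      # credits
                              "MYCONTENT")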

Happy packaging


Friday, May 18, 2012

Uncompressing Multiple File Geodatabases

Often when I receive spatial data from individuals on a DVD or CD, it is in the form of a file geodatabase, and it's compressed.  If all the data is in a single file geodatabase, it is not a big problem, but when I get multiple file geodatabases, that's when I break out my Python.

Here is a helpful little script that will try to uncompress all file geodatabases in a single folder.  It should be noted that if a file geodatabase is already uncompressed, it will just be ignored.


import arcpy
from arcpy import env

if __name__ == '__main__':
    try:
        workspace = arcpy.GetParameterAsText(0)
        env.workspace = workspace
        fgdbs = arcpy.ListWorkspaces("*", "FileGDB")
        for fgdb in fgdbs:
            arcpy.UncompressFileGeodatabaseData_management(fgdb)
        env.workspace = None
        arcpy.SetParameterAsText(1, "true")
    except:
        arcpy.AddError(str(arcpy.GetMessages(2)))
        arcpy.SetParameterAsText(1, "false")
So what I have done is list all the file geodatabases in a given workspace (folder), then run the standard UncompressFileGeodatabaseData_management() on each one.  When I'm done processing, I set the environment workspace back to None just to clean up some loose ends.  I could also delete objects like fgdbs, but since the script ends right after, those variables will not persist anyway.
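Going the other direction is just as simple; a minimal sketch using the companion Compress File Geodatabase Data tool (the folder path is hypothetical):

import arcpy
from arcpy import env

# Compress every file geodatabase in a folder, e.g. before burning to DVD.
env.workspace = "C:/outgoing"
for fgdb in arcpy.ListWorkspaces("*", "FileGDB"):
    arcpy.CompressFileGeodatabaseData_management(fgdb)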

Enjoy

Friday, November 4, 2011

Parallel Python with GIS

I've begun to dive into Parallel Python to see if I can reduce processing times on long-running tasks by dividing the workload among a cluster of computers.

I've already run into some problems:
  • Spatial data is not serializable
  • Lack of good documentation from Parallel Python
  • No upload data/download results function if you use files.
  • Servers time out
  • Tasks randomly restart
You'll have to have arcpy installed on all the machines participating in the cluster computing.
Now that you know that, you can get started as easily as this:

import pp

ppservers = ('xx.xxx.xx.xx:8080',)
job_server = pp.Server(ppservers=ppservers, ncpus=0)

You set ncpus=0 in order to prevent any worker processes from running locally. To submit a job:

libs = ("arcpy",)

job_server.submit(function,     # function to perform
                  (variables,), # function arguments
                  (),           # dependent functions (none here)
                  libs)         # modules used by the function
job_server.wait()        # waits for the jobs to complete
job_server.print_stats() # print some stats about the server and tasks
del job_server


It's that simple to run the task.
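Putting it together, here is a minimal end-to-end sketch under the same assumptions (the function, feature class path, and server address are hypothetical):

import pp

def count_features(fc):
    # arcpy must be installed and importable on the remote node
    import arcpy
    return int(arcpy.GetCount_management(fc).getOutput(0))

ppservers = ('xx.xxx.xx.xx:8080',)
job_server = pp.Server(ppservers=ppservers, ncpus=0)

# submit() returns a job object; calling job() blocks until the result arrives
job = job_server.submit(count_features,
                        ("C:/data/parcels.gdb/parcels",),
                        (),          # dependent functions
                        ("arcpy",))  # modules to import on the worker
print job()
job_server.print_stats()
del job_server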

Enjoy

Tuesday, March 23, 2010

Using Kernel Density

Has anyone ever been able to get the kernel density function to work in a python script?  It appears that the function does not honor the inputs, but I could be mistaken.  Post some code if you got it working.

The script I'm using is published to ArcGIS Server; if the kernel density is performed from ArcToolbox, it returns the proper image, while the Python call returns a black or blank tile, depending on how ArcGIS Server wants to behave that minute.

It's very frustrating because no errors are returned, and both the script and the ArcToolbox function work in Desktop.
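For anyone comparing notes, the call pattern in question looks roughly like this in 10.x arcpy terms (paths and parameter values are hypothetical):

import arcpy
from arcpy import sa

# Kernel Density is a Spatial Analyst tool, so the extension must be
# checked out before the call and checked back in afterwards.
arcpy.CheckOutExtension("Spatial")
density = sa.KernelDensity("C:/data/incidents.shp", "NONE",
                           cell_size=30, search_radius=500)
density.save("C:/data/incident_density.tif")
arcpy.CheckInExtension("Spatial")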

Monday, August 24, 2009

Serialize Anything!

When you start with threading, especially in the AGX 900 SDK, you are going to need to translate your objects into strings. To do this, just use the two functions below.



// Requires: using System.IO;
//           using System.Runtime.Serialization.Formatters.Binary;

private byte[] getByteArrayWithObject( Object o )
{
    /*
    1) Create a new MemoryStream class with the CanWrite property set to true
       (should be by default, using the default constructor).

    2) Create a new instance of the BinaryFormatter class.

    3) Pass the MemoryStream instance and your object to be serialized to the
       Serialize method of the BinaryFormatter class.

    4) Call the ToArray method on the MemoryStream class to get a byte array
       with the serialized data.
    */
    MemoryStream ms = new MemoryStream();
    BinaryFormatter bf1 = new BinaryFormatter();
    bf1.Serialize( ms, o );
    return ms.ToArray();
}

private object getObjectWithByteArray( byte[] theByteArray )
{
    MemoryStream ms = new MemoryStream( theByteArray );
    BinaryFormatter bf1 = new BinaryFormatter();
    ms.Position = 0;

    return bf1.Deserialize( ms );
}



You can then translate your byte[] into a string with the built-in .NET Convert class (Convert.ToBase64String and Convert.FromBase64String). Now you have values you can pass into the AGX BackGroundThread.

Enjoy

Wednesday, July 15, 2009

List of All GeoProcessing Data Types

From the ESRI resource center comes a PDF that contains all the different geoprocessing data types with their correlating ArcObjects. This is very helpful, and any .NET GP developer should have it printed out and right next to them.

Enjoy

Thursday, July 9, 2009

2 GP Functions That Make Life Easier

Here are two functions that make life easier for custom GP development. The first finds a parameter value by the parameter name and returns an IGPValue. The second gets a parameter by name and returns an IGPParameter3. Get parameter value by name allows you to retrieve the unpacked value directly, without digging through the parameter array yourself.

Get Parameter By Name

public IGPParameter3 GetParameterByName( IArray paramvalues, string name )
{
    IGPParameter3 gpParameter;
    for (int i = 0; i < paramvalues.Count; i++)
    {
        gpParameter = (IGPParameter3)paramvalues.get_Element( i );
        if (gpParameter.Name.Equals( name, StringComparison.OrdinalIgnoreCase ))
            return gpParameter;
    }
    return null;
}


Get Parameter Value By Name

public IGPValue GetParameterValueByName( IArray paramvalues, string name )
{
    IGPUtilities2 gpUtils = new GPUtilitiesClass();
    IGPParameter3 gpParameter;

    for (int i = 0; i < paramvalues.Count; i++)
    {
        gpParameter = (IGPParameter3)paramvalues.get_Element( i );
        if (gpParameter.Name.ToUpper() == name.ToUpper())
            return gpUtils.UnpackGPValue( gpParameter );
    }
    return null;
}

Wednesday, July 1, 2009

Creating a Feature Set via C#

A very powerful Model Builder feature is the create variable option, which allows users to create almost any type of user input desired. The feature set variable is very helpful when a user needs to interact with the map or a recordset: it lets users draw points, polylines, or polygons on the fly. To do this in C#, you need to use the IGPRecordSetLayerClass(), but the process of making the interactive part work isn't that straightforward.

To get the interactive part of the feature set to work, you must define the schema with either a layer file or a feature class, as you would in Model Builder. When you take a look at the ArcObjects description of a parameter, you notice a pointer called Schema, but this isn't where you define the template data for a GPFeatureRecordSet; you define it in Value().


inputParameter.DataType = new GPFeatureRecordSetLayerTypeClass();
//
// create the GP Feature Recordset Layer Object
// from template data
//
IFeatureClass FC = (IFeatureClass)m_GPUtilities.OpenDatasetFromLocation( @"