Reading and writing BLOB data was always an issue prior to 10.1, but now it's no longer a problem. A reader pointed this out the other day, so I thought: why not point it out to everyone?
From the help (source: resourcebeta.arcgis.com):
A BLOB is data stored as a long sequence of binary numbers. ArcGIS stores annotation and dimensions as BLOBs, and items such as images, multimedia, or bits of code can be stored in this type of field. You can use a cursor to load or view the contents of a BLOB field.
In Python, BLOB fields can accept strings, bytearray, and memoryviews. When reading BLOB fields, a memoryview object is returned.
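To see what that memoryview behavior looks like without touching a geodatabase, here is a plain-Python sketch; the bytes literal is just sample data standing in for a BLOB value:

```python
# Simulate what a BLOB field hands back: a memoryview over raw bytes
raw = bytearray(b'\x89PNG\r\n')   # sample binary data (a PNG file header)
view = memoryview(raw)

# tobytes() copies the buffer out as an immutable bytes object,
# which is the form you want before writing it to disk
recovered = view.tobytes()
print(recovered == bytes(raw))    # the copy matches the original buffer
```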
So what does this mean for you? Well, it's time to store those documents, vacation pictures, and geotagged videos in a spatial database.
Writing Blobs:
from arcpy import da

# read the file's raw bytes
with open(r"c:\temp\image.jpg", 'rb') as f:
    myFile = f.read()

with da.InsertCursor(r"c:\temp\demo.gdb\table", ['blobFieldname', 'fileName']) as cursor:
    cursor.insertRow([myFile, 'image.jpg'])
Reading Blobs:
from arcpy import da
import os

with da.SearchCursor(r"c:\temp\demo.gdb\table", ['blobFieldname', 'fileName']) as cursor:
    for row in cursor:
        binaryRep = row[0]   # memoryview object
        fileName = row[1]
        # save to disk
        with open(os.path.join(r"c:\saveFolder", fileName), 'wb') as f:
            f.write(binaryRep.tobytes())
Writing and reading Blob fields involves the Python built-in open(). This function can open files in various modes, but make sure you use the binary modes 'rb' (read binary) and 'wb' (write binary), or this process will not work. You can open essentially any file as a Blob, so stick whatever you want in there, but realize that this can greatly balloon the size of your database. It does, however, open up a whole new host of ways of transporting file-based data. Off the top of my head, I can imagine how this could improve data transport for local parallel processing of spatial and non-spatial data.
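The binary round trip itself needs nothing but the standard library, so you can sanity-check the 'wb'/'rb' behavior before involving a geodatabase at all. This sketch writes raw bytes out and reads them back unchanged; the temp-file path and payload here are illustrative, not part of the examples above:

```python
import os
import tempfile

payload = b'\x00\x01binary payload\xff'   # any raw bytes will do

# 'wb' writes the bytes exactly as given, with no newline translation
path = os.path.join(tempfile.gettempdir(), 'blob_demo.bin')
with open(path, 'wb') as f:
    f.write(payload)

# 'rb' reads them back untouched, so the round trip is lossless
with open(path, 'rb') as f:
    assert f.read() == payload

os.remove(path)
```

If you opened the file in text mode instead, newline translation could silently corrupt the data, which is why the binary modes are mandatory here.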
Enjoy