Configuring Python and GeoDjango for Amazon Web Services Deployment

A quick rundown of configuring GeoDjango to run on Amazon Web Services' Elastic Beanstalk (EB). This assumes you have a separate database instance up and running and that you've set up the project to use Elastic Beanstalk (i.e. eb init). Configuration files are broken into the tasks they handle and are read automatically by Elastic Beanstalk.

  1. Create a directory named .ebextensions in the root of the project.
  2. Under .ebextensions add the following files:
    • commands.config – Executes shell commands to deploy static files and migrate database changes. Enables the Extra Packages for Enterprise Linux (EPEL) yum repository so that you can install the GIS libraries through yum (very cool).
    • django.config – Default settings for Elastic Beanstalk.
    • environment.config – Sets up any environment variables that you want.
    • packages.config – Tells Elastic Beanstalk what packages to install.
  3. commands.config:
    commands:
      01_enable_epel:
        command: "sudo yum-config-manager --enable epel"
    container_commands:
      02_collectstatic:
        command: "source /opt/python/run/venv/bin/activate && python manage.py collectstatic --noinput"
      03_migrate:
        command: "source /opt/python/run/venv/bin/activate && python manage.py migrate"
        leader_only: true
  4. django.config:
    option_settings:
      aws:elasticbeanstalk:container:python:
        WSGIPath: kyp_backend/wsgi.py
  5. environment.config:
    option_settings:
      - option_name: DB_NAME
        value: [db_name]
      - option_name: DB_USERNAME
        value: [username]
      - option_name: DB_PASSWORD
        value: [password]
      - option_name: DB_HOSTNAME
        value: [amazon host name]
      - option_name: DB_PORT
        value: [port]
      - option_name: HOST_ENVIRONMENT
        value: [development/test/production]

  6. packages.config:
    packages:
      yum:
        postgresql94-devel: []
        geos: []
        binutils: []
        gdal-devel: []
        proj: []
        gdal: []
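For reference, here is a sketch of how settings.py might consume the variables defined in environment.config above. The kyp_backend project name comes from this post; everything else is an assumption about a typical GeoDjango settings module, not the author's actual settings:

```python
# Sketch: read the environment.config variables into Django's database
# settings, using GeoDjango's PostGIS backend.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": os.environ.get("DB_NAME", ""),
        "USER": os.environ.get("DB_USERNAME", ""),
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
        "HOST": os.environ.get("DB_HOSTNAME", ""),
        "PORT": os.environ.get("DB_PORT", "5432"),
    }
}

# HOST_ENVIRONMENT can gate debug behavior per environment.
DEBUG = os.environ.get("HOST_ENVIRONMENT", "development") != "production"
```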

OGR/GDAL Examples

Generate a shapefile from a PostgreSQL table:
ogr2ogr -f "ESRI Shapefile" [Shapefile Name].shp PG:"dbname='[DB Name]' host='[Host Name]' port='[Port Number]' user='[User Name]'" [Table Name]
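If you run this export often, the invocation can be assembled programmatically so the PG connection-string quoting lives in one place. A sketch; the database values below are made-up placeholders, and the resulting list is meant for subprocess.run:

```python
# Sketch: build the ogr2ogr shapefile-export command from parts.
def shapefile_export_cmd(shapefile, table, dbname, host, port, user):
    # PG connection string, quoted the way ogr2ogr expects.
    pg = f"dbname='{dbname}' host='{host}' port='{port}' user='{user}'"
    return [
        "ogr2ogr",
        "-f", "ESRI Shapefile",
        f"{shapefile}.shp",
        f"PG:{pg}",
        table,
    ]

cmd = shapefile_export_cmd("parcels", "public.parcels",
                           "gisdb", "localhost", "5432", "gis")
# Pass cmd to subprocess.run(cmd) to execute.
```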

Format Types

  • ESRI Shapefile
  • MapInfo File
  • S57
  • DGN
  • Memory
  • BNA
  • CSV
  • GML
  • GPX
  • KML
  • GeoJSON
  • Interlis 1
  • Interlis 2
  • GMT
  • GPKG
  • SQLite
  • ODBC
  • WAsP
  • MSSQLSpatial
  • PostgreSQL
  • DXF
  • Geoconcept
  • GeoRSS
  • GPSTrackMaker
  • PGDump
  • GPSBabel
  • SOSI
  • GFT
  • GME
  • CouchDB
  • ODS
  • XLSX
  • ElasticSearch

Postgresql/Postgis Cheatsheet

Export table data (data only):
pg_dump --column-inserts -a -h [host name] -U [user] -d [database] -t [table] > [output file]

Export table schema and data:
pg_dump --column-inserts -h [host name] -U [user] -d [database] -t [table] > [output file]

Execute script:
psql -h [host] -U [user] -d [database] -f [input file]
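The two pg_dump variants above differ only by the -a flag, so they can be generated from one helper. A sketch; the hostnames and table names in the usage line are placeholders, and -f stands in for the shell redirection:

```python
# Sketch: build the pg_dump argument list from the cheatsheet commands.
def pg_dump_args(host, user, database, table, outfile, data_only=False):
    args = ["pg_dump", "--column-inserts"]
    if data_only:
        args.append("-a")  # data only, no CREATE TABLE statement
    args += ["-h", host, "-U", user, "-d", database,
             "-t", table, "-f", outfile]  # -f writes the file directly
    return args

args = pg_dump_args("db.example.com", "gis", "forestry",
                    "stands", "stands.sql", data_only=True)
```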

Planted Firebreaks and Fallow Field Followup

The lessons learned from planting firebreaks are fairly small but important to prevent wasted effort and money. Firstly, the seeding failed on both wet and dry sites, but for different reasons. The dry sites were too exposed to the sun, leaving the soil too dry. Soil fertility was also in question. With more effort and a drought-tolerant seed mix, this could likely be overcome. Our solution is to let natural warm season grasses establish themselves and keep the firebreaks mowed.

Xeric Site Growth

In the wet sites, in our bottomlands, the clover and small grains were successful. Unfortunately, they succumbed to competition from fescue grass. With more regular mowing this could have been delayed, but we failed to give it the attention needed.

Mesic Site Growth

As far as firebreak widths go, two tractor widths is too much for our properties. On flat, untried agricultural land this is possible, but in the upper piedmont of the Appalachians we have trees and small pastures that make a wide firebreak unrealistic. Our firebreaks range in width from one to one and a half tractor widths, and that seems plenty sufficient.

Disking in a Firebreak

In summary, allow warm season grasses to grow on drier sites. On moist sites a mix of rye, wheat, red clover, and a white clover (Dutch or ladino) should work well. Finally keep firebreaks mowed to reduce competition and to keep them clearly marked. Plus, disking the same area is a lot easier and you’ll run into fewer rocks as well!

Here are the results of the fallow field as it grew in:

Fallow Field A
Fallow Field B

Prescribed Burning for Early Oak Successional Habitat

After a few years of burning wildlife openings we were ready for our first woods burn. We were aiming for a low intensity fire that would clean up the undergrowth and increase herbaceous vegetation in a ten acre southwest facing stand. This would be our test plot, chosen because its aspect and elevation meant that it held fire adapted species. A moist north facing slope would contain more fire intolerant species like poplar and black cherry. This stand mostly consisted of well spaced mature pines and oaks. Lastly, the terrain and tree density, unlike other stands, permitted the development of fire breaks.

Stand before burning

The process began with marking the proposed fire breaks by walking skid trails and old road beds and then tying them together, avoiding drainages as much as possible. These were mapped, as are all our other land management features, using Geoforst, the geographic information systems data model we developed to handle land management.

Map of burn planning features

Next, the firebreaks were created by first bush hogging the path and then alternately using a field plow and a disk harrow to ensure that a good barrier to fire was created. Getting off the tractor and moving logs and rocks is necessary.

Video of Disking Fire Breaks

We then contacted the state forestry commission and scheduled a meeting with a forester in charge of prescribed burning. He stopped by to discuss goals and strategy and to look over the fire breaks. We arranged for them to come back in a week when the weather was good and they had available time.

To protect the hardwoods, we walked the stand marking crop trees. In this case, crop trees were undamaged hardwoods with good form. Since we knew that the fire would be low intensity we did not expect any mortality of the mature oaks, but we did not want to run the risk of fire scarring them. The mature pines are very tolerant of fire, and since we plan to harvest them in the next ten years, they were of no concern.

Crop trees marked with green flagging

To ensure no damage to the crop trees we did two things. First, we removed all brush from the trunks of the trees. Secondly, we raked the leaf matter from around the trunks. We felt this was important since it was the first time these woods had been burned in at least fifty years. An alternate method uses a backpack leaf blower, which purportedly works well. One was not available, so we went with the rake.

At this point we were truly ready and were now waiting on the weather. The day came and the forestry commission showed up with three men and a bulldozer, just in case. We wanted all the protection we could get since we really didn't know what we were in for. Conditions were mild and the fire was very manageable. Watching how things were done and asking lots of questions has us very confident that we can perform the next burn without assistance.

Low intensity burn, just like we planned for

The next stage will be monitoring the results. Since this was a very low intensity fire, there will likely need to be several more in close succession to accomplish our goals. Ideally we'd like to stimulate oak regeneration, but the literature on the subject seems contradictory. In fact, there are a fair number of foresters who still believe that fire should be excluded from hardwood stands. By culling the Virginia pine and implementing a fire regime, I believe this can be achieved while simultaneously opening the understory to herbaceous vegetation. Our biggest challenge will be overpowering the Virginia pine seedlings. At any rate, studies show that fires were frequent in the Southern Appalachians, even in hardwood stands. Let's see what happens.

The Fire–Oak Literature of Eastern North America: Synthesis and Guidelines

SDE Examples for Oracle

Export from database to SDE export file
sdeexport -l [SOURCE_TABLE],[GEOMETRY FIELD] -f .\[OUTPUT FILE NAME].sdx -i sde:oracle11g -u [USERNAME] -p [PASSWORD]@[TNS NAME]

Export from database to shapefile
sde2shp -o init -l [SOURCE_TABLE],[GEOMETRY FIELD] -f [OUTPUT FILE NAME].shp -t polygonZ -a all -i sde:oracle11g -u [USERNAME] -p [PASSWORD]@[TNS NAME]

Import to database from SDE export file
sdeimport -o CREATE -t [OUTPUT_TABLE_NAME] -f [INPUT_FILE_NAME] -g RTREE -k [DBTUNE KEYWORD] -i sde:oracle11g -u [USER] -p [PASSWORD]@[TNS NAME]

Delete table from database including SDO/SDE metadata.
sdetable -o delete -t [TABLE_NAME] -u [USER] -p [PASSWORD]@[TNS NAME] -i sde:oracle11g

Drop and Create Spatial Index (run directly in SQL*Plus)
DROP INDEX [INDEX_NAME];
CREATE INDEX [INDEX_NAME] ON [TABLE_NAME] ([GEOMETRY_FIELD])
  INDEXTYPE IS MDSYS.SPATIAL_INDEX;

Oracle Spatial Examples

Get projection (SRID) information (note: a table alias is required).
select c.[GEOMETRY FIELD].sdo_srid
from [TABLE] c;

Get the geometry type
select c.[GEOMETRY_FIELD].sdo_gtype
from [TABLE_NAME] c;

Interpret the result using Oracle's DLTT encoding: the first digit is the dimensionality, the second is the linear referencing (LRS) dimension, and the last two digits are the geometry type (01 = point, 02 = line, 03 = polygon, 04 = collection, 05 = multipoint, 06 = multiline, 07 = multipolygon). For example, 2001 is a two-dimensional point.
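SDO_GTYPE's digit packing is easy to misread, so here is a small decoding helper. A sketch based on Oracle's documented DLTT encoding; the function and table names are mine, not Oracle's:

```python
# Decode Oracle Spatial's SDO_GTYPE (DLTT): first digit = dimensionality,
# second = LRS measure dimension (0 when none), last two = geometry type.
GTYPE_NAMES = {
    "00": "UNKNOWN", "01": "POINT", "02": "LINE/CURVE",
    "03": "POLYGON", "04": "COLLECTION", "05": "MULTIPOINT",
    "06": "MULTILINE/MULTICURVE", "07": "MULTIPOLYGON",
}

def decode_gtype(gtype):
    s = f"{int(gtype):04d}"  # e.g. 2001 -> "2001"
    return {
        "dimensions": int(s[0]),
        "lrs_dimension": int(s[1]),
        "type": GTYPE_NAMES.get(s[2:], "UNKNOWN"),
    }

# decode_gtype(2001) -> {'dimensions': 2, 'lrs_dimension': 0, 'type': 'POINT'}
```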

Ensure table is registered in Oracle Spatial

select *
from mdsys.sdo_geom_metadata_table;

Get spatial tolerance of table
SELECT sdo_dimname, sdo_tolerance
FROM TABLE (
  SELECT diminfo
  FROM user_sdo_geom_metadata
  WHERE table_name = '[TABLE_NAME]'
);

Create point geometry from x/y values
SELECT mdsys.SDO_GEOMETRY(
  2001, -- point data
  [SRID],
  mdsys.SDO_POINT_TYPE([X Ordinate], [Y Ordinate], NULL),
  NULL,
  NULL)
FROM dual;

Create Spatial Index
CREATE INDEX "[Index Name]" ON "[Table Name]" ("[Geometry Field]")
  INDEXTYPE IS MDSYS.SPATIAL_INDEX;

Add Geometry Metadata
INSERT INTO user_sdo_geom_metadata
  (table_name, column_name, diminfo, srid)
VALUES (
  '[TABLE_NAME]',
  '[GEOMETRY_FIELD]',
  mdsys.SDO_DIM_ARRAY(
    mdsys.SDO_DIM_ELEMENT('X', [Min X], [Max X], [Tolerance]),
    mdsys.SDO_DIM_ELEMENT('Y', [Min Y], [Max Y], [Tolerance])),
  [SRID]);

Determine Spatial Relationship Between Geometries
SELECT a.[Key_Field]
  , b.[Key_Field]
  , sdo_geom.relate(
      a.[Geometry_Field]
    , 'determine'
    , b.[Geometry_Field]
    , [Tolerance])
FROM [Table_Name_1] a, [Table_Name_2] b;

Creating Fallow Field Wildlife Openings with Planted Firebreaks

A result of the thinning was six wildlife openings that we had marked for total harvest during the tree marking phase of the project. It is important to note that a wildlife opening, at least a maintainable one, is not created solely from the cutting of trees. In fact, that is just the beginning.

The process started with a literature review on wildlife openings and habitat development. The most thorough and applicable document found was an excellent paper by the Virginia Department of Game and Inland Fisheries, Managing Habitats for Ruffed Grouse in the Central and Southern Appalachians. Although it is focused on grouse, it presents practical, scientifically based forestry practices that promote the acceleration and spatial compression of natural processes to benefit game while keeping them viable in both economic and labor senses. I can't say enough about this paper. Read it, read it again, and keep revisiting it, as it is information dense.

In a nutshell, this is how we transformed a cut-over area into a wildlife opening:

Freshly cleared opening.

You can see in the photo above that it's going to take more than spreading seed to make a permanent opening from this. To accomplish this, we first burned the slash. This unfortunately cost us some of the trees surrounding the opening; the heat was blistering.

Following this, we cleared the remaining slash and prepared the opening for planting. This required cutting the remaining stumps flush so that we would be able to navigate the openings with the tractor. Regular burning of the wildlife openings is anticipated, but it is mandatory that we can control them by mechanical means as well, via rotary cutter.

Cleaned opening

After cleaning we were ready to put in the planted firebreaks. The route we chose for most of the openings was to create a firebreak, 10 meters from the edge of the woods and two tractor widths wide, which we would plant with a mixture of wheat and white clover. The interior of the opening would remain as a fallow field, which we would burn every few years.

To accomplish this, we marked the edge of the firebreak with red tree marking paint directly on the ground, and plowed in the two strips. This required cleaning the plow often and moving leftover rocks and debris out of the way.

Plowed in firebreak

For this we used a standard chisel plow.

Once the ground had been broken up and the rock, roots, and debris cleared from the plowed strips, a disc harrow was used to prepare for planting. This worked moderately well, but for a proper job it requires a lot of time, and a bulldozer would be handy for the stumps.
Planting was done by calculating the plowed area, and seed was measured using a bathroom scale. The seed mixture was distributed using a hand spreader, which affords much more control than a tractor mounted spreader.
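The seed arithmetic above is simple enough to sketch. The strip dimensions and the seeding rate in the example are made-up numbers for illustration, not recommendations; check your seed mix's rate:

```python
# Sketch: plowed-strip area from field measurements, then pounds of
# seed to weigh out on the bathroom scale.
SQ_FT_PER_ACRE = 43560

def seed_pounds(strip_length_ft, strip_width_ft, n_strips,
                rate_lbs_per_acre):
    area_sq_ft = strip_length_ft * strip_width_ft * n_strips
    return area_sq_ft / SQ_FT_PER_ACRE * rate_lbs_per_acre

# Two 300' x 12' strips at an assumed 90 lbs/acre rate:
# 300 * 12 * 2 = 7200 sq ft -> 7200 / 43560 * 90, roughly 15 lbs of seed.
needed = seed_pounds(300, 12, 2, 90)
```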

Finally, a piece of chain link fence was dragged behind an ATV to cover the seed and the rest we left up to the weather.

Here is a little bit on the results of this:
Planted Firebreaks and Fallow Field Followup

SQL Spatial Examples

Posting these here for my own reference, but use them yourself if you’d like.

Select Well Known Text from Geometry
select [Geometry Column].STAsText() from [Table];

Create WKT from lat/long fields, good for insert/update statements
select GEOGRAPHY::STGeomFromText('POINT (' + convert(varchar(20),[LONGITUDE]) + ' ' + convert(varchar(20),[LATITUDE]) + ')', 4269) from [TABLE];
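The same string construction can be done in Python before the value ever reaches SQL, with longitude first as the statement above expects. A small sketch; the coordinates in the example are arbitrary:

```python
# Build a WKT point string: longitude first, then latitude,
# matching what STGeomFromText expects.
def point_wkt(longitude, latitude):
    return f"POINT ({longitude} {latitude})"

wkt = point_wkt(-131.2325, 55.4097)
# -> "POINT (-131.2325 55.4097)"
```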

Intersection Example:
SELECT g.* , g.[Geometry Field].STAsText() FROM [Schema].[Table] g WHERE [Geometry Field].STIntersects(geometry::STGeomFromText('POLYGON ((-131.232584635417 55.4096875, -131.230598958333 55.4096875, -131.230598958333 55.4100455729167, -131.232584635417 55.4100455729167, -131.232584635417 55.4096875))', [EPSG ID])) = 1

Get Area of Intersection
select t1.[Geometry Field].STIntersection(t2.[Geometry Field]).STArea()
from [TABLE_NAME_1] t1,
     [TABLE_NAME_2] t2;

Create Spatial Index:
CREATE SPATIAL INDEX [sidx_tablename_geom] ON [Schema].[Table]
(
    [Geometry Field]
)
USING GEOMETRY_GRID
WITH (
    BOUNDING_BOX = (-173.116318241729, 52.7677349180939, -129.473430300096, 71.3903800225212),
    GRIDS = (LEVEL_1 = MEDIUM, LEVEL_2 = MEDIUM, LEVEL_3 = MEDIUM, LEVEL_4 = MEDIUM)
);

Get Bounding Box of Feature Class:
WITH ENVELOPE as (
    SELECT [Geometry Field].STEnvelope() as envelope from [Schema].[Table]
),
CORNERS as (
    SELECT envelope.STPointN(1) as point from ENVELOPE
    UNION ALL select envelope.STPointN(3) from ENVELOPE
)
SELECT
    MIN(point.STX) as MinX,
    MIN(point.STY) as MinY,
    MAX(point.STX) as MaxX,
    MAX(point.STY) as MaxY
FROM CORNERS;

Use GDAL to Insert Data from Shapefile to MS SQL:
ogr2ogr -f MSSQLSpatial "MSSQL:server=[Server Name];database=[DB Name];trusted_connection=yes" "…\[Shapefile Name].shp"

SRID from geometry field
select [Geometry Field Name].STSrid FROM [TABLE NAME];

Using a Power Mac G5 as a ZFS Time Machine Backup Server

If you are considering setting up a home network file server using Zettabyte File System (ZFS) technology, the Power Mac G5 should be given strong consideration. These machines are inexpensive, they have room for lots of disks (see G5 Jive), they are beautiful, and they network natively with your other Mac hardware using AFP.

Documentation on using OS X as a ZFS file server and Time Machine backup location is scarce. After reading about using OpenSolaris for this purpose (Time Machine Volumes with ZFS and AFP) and some serious trial and error, I have a working implementation. To save others from struggling through it, here are my notes; I hope you find them useful.

To get started, make sure you have installed MacZFS and followed the MacZFS Getting Started instructions to create a pool. How one creates a ZFS pool is highly variable, situation dependent, and very well documented; I can't do a better job, so I won't repeat it.

Here are the baseline ZFS pools for reference, before creating a new example pool:

zpool status

  pool: mercury
 state: ONLINE
 scrub: none requested
config:

        NAME       STATE   READ WRITE CKSUM
        mercury    ONLINE     0     0     0
          disk0s2  ONLINE     0     0     0

errors: No known data errors

zfs list

NAME                 USED   AVAIL  REFER  MOUNTPOINT
mercury              338G   2.36T  172G   /Volumes/mercury
mercury/Music        27.6G  2.36T  27.6G  /Users/torsten/Music
mercury/tm-quercus   136G   364G   136G   /Volumes/tm-quercus
mercury/tm-smorin    2.86G  247G   2.86G  /Volumes/tm-smorin

Okay, now let's make a new filesystem in the pool and ready it for sharing:

# Create the backup filesystem (a dataset within the pool).
zfs create mercury/backups

# Let's limit the backup filesystem to 500 GB.
zfs set quota=500G mercury/backups

# Restrict permissions.
chmod -R 770 /Volumes/mercury/backups/

# Change the owner to the backup user.
chown -R torsten /Volumes/mercury/backups/

# Tell Time Machine to recognize this as a supported device.
touch /Volumes/mercury/backups/

So far things are pretty similar to how you’d do this on OpenSolaris, let’s take a look:

zfs list

NAME                 USED   AVAIL  REFER  MOUNTPOINT
mercury              338G   2.36T  172G   /Volumes/mercury
mercury/Music        27.6G  2.36T  27.6G  /Users/torsten/Music
mercury/backups      23.5K  500G   23.5K  /Volumes/mercury/backups  # our new filesystem!
mercury/tm-quercus   136G   364G   136G   /Volumes/tm-quercus
mercury/tm-smorin    2.86G  247G   2.86G  /Volumes/tm-smorin

Now we need to trick OS X into sharing our ZFS filesystem. Here we create a new directory inside it. This needs to be done because, oddly, Finder will see the new filesystem as the parent pool.

mkdir /Volumes/mercury/backups/time-back

Since we are running OS X Leopard (version 10.5), System Preferences won't display many of the directories in the root of the OS. So, to make this new filesystem accessible to the GUI we need to make a symlink in the user's home directory.

ln -s /Volumes/mercury/backups/time-back ~/time-back

Now we are ready to share! Go to System Preferences > Sharing and add the newly created directory as a share.
Sharing pane in System Preferences

Let's see what Time Machine thinks. First, let's mount the drive onto the Mac laptop (mine runs Lion, 10.7). Open Finder and press Command-K to connect to a server. Then type afp://[Server Name or IP]/[Share Name] like so:

Finally, open Time Machine Preferences and select the backup disk.

View of Time Machine with the ZFS share visible.

** If the drive does not appear in Time Machine Preferences do this on the client (in this case the laptop):

defaults write com.apple.systempreferences TMShowUnsupportedNetworkVolumes 1

If all went well you’re now storing your backups on ZFS to a remote server!

Adding the Spatialite GeoServer Plugin on OS X

Start by going to the GeoServer SpatiaLite plugin page and reading through the documentation. It didn't quite work out for me, so I thought I'd be more specific and share the steps I took to make this work.

1) Get the plugin. The URL given in the GeoServer manual is incorrect, as the plugin is not in the trunk of the project. Instead, use the following command to download the plugin –

curl -O [plugin archive url]

2) Unzip the archive file –

unzip [archive file].zip

3) Copy both the jar files to Geoserver lib –

cp gt-jdbc-spatialite-2.7-SNAPSHOT.jar /Applications/

cp sqlite-jdbc-spatialite-3.7.2-1.jar /Applications/

4) Install geos and proj-
sudo port install geos
sudo port install proj

5) Startup/restart Geoserver.

6) Add a new store and you should see Spatialite as an option.

Getting a List of Installed Features from Server 2008 Using PowerShell

This is great for getting a list of installed features from a Windows server. So far I've only tried doing this locally, aside from a quick test on a Windows 7 machine, where it failed.

In powershell:
Import-Module servermanager
Get-WindowsFeature | Where-Object { $_.Installed }

Fix Orphaned Users SQL2008 R2

General pattern for this common problem following a database restore. Note that the syntax has changed from SQL2005.

SQL 2008 R2 Syntax:

USE <database_name>;
EXEC sp_change_users_login @Action='Report';
EXEC sp_change_users_login @Action='Update_One', @UserNamePattern='<user>', @LoginName='<login>';

SQL 2005 Syntax:

USE <database_name>;
EXEC sp_change_users_login 'Report';
EXEC sp_change_users_login 'Auto_Fix', 'user';

On the Road to Timbuktu, Dakar to Bamako

“Hey guys,” shouted Matt Kerr as we sped across a bridge, “I think that might have been the border back there.” Cyr and I couldn’t hear what he said, since we were in the back seat of the truck, so we just ignored him. Just after the bridge we stopped; the driver got out and ran to a control point, and we were not to follow. Matt said, “Are you sure that wasn’t the border?” “No way man, you don’t just go driving through borders accidentally. Trust us, you’ll know when we hit the border,” we said.

Sure enough, we continued on for a while and hit what looked like a border, and it seemed we’d be there for a while. There was some customs work that needed to be done on the Toyota Hilux we’d bought a ride in, and paperwork in Africa is not taken lightly. It had been shipped from Europe to Dakar with the intention of selling it in Mali for a profit. While we waited, we helped a Frenchman with a DVD player he had just installed in his Mercedes. He was very thankful, and after chatting a bit with him Andrew asked if this was the border. “No” was the reply. “The border was several kilometers the way you came. You are in Mali now.”

Fantastic, we’d crossed the border illegally and were now stuck in no man’s land for the second time on the trip. We were already pushing our luck since two of us (not me) didn’t have visas and hoped to get them at the border. Not having an exit stamp from Senegal was not going to make that any easier. We also didn’t relish the idea of going back to Senegal and begging for forgiveness. They probably had a warrant out for our arrest. That too would be less than pleasant.

Our new French friend offered to help us since we had helped him get his DVD player running, and back we went into what could only be a bad situation. It felt like going to the dentist’s office: the best you can hope for is that nothing terribly horrible will happen to you. And so it was here; it felt like we were already on our way to jail. We rode to the border outpost and prepared ourselves for another onslaught. Sure enough, after the formalities were done, one of the guards lost his cool, and from what I picked up in French, he was not happy. Our new friend did a fantastic job sticking up for us and gave the guard a full-on tongue lashing, but alas we were sent to the police station to have a “word” with the head honcho. I focused on the bright side and figured that by spending a night or two in jail we’d have no problems keeping on budget.

Termite Mound through Truck Window

So there they were, intently playing Scrabble and not too interested in breaking it up to deal with us. One of them finally moved into the office and we told them only that we needed to fill out the exit paperwork. He moved agonizingly slowly and I knew that things would get much more complicated if the border guard decided to make an appearance…which of course he did. We were almost done with the paperwork when up pulls our buddy the border guard, who starts making his way into the office. I positioned myself so that I could clearly see the border guard getting closer and the police commandant filling out the paperwork. Just as the guard made it to the steps the commandant lifted the stamp and hovered it over the passports. “Come on, come on! Stamp the damned passports for the love of god!” I screamed internally. But no, his hovering cost us the battle and the guard was happy to interrupt the process. So very, very close to freedom!

We had some explaining to do and after some apologies they let us through. At this point we figured out where the Senegal border was and we concluded that we’d better start working on the whole Mali thing pronto. By now it was dark and we’d been at the border for hours. We walked down the road and found someone who looked official. He asked us right away if we had visas, we said we needed two. “Ah” he said merrily, “that is an infraction”. Now Matt was very scared that he’d be rejected at the border and have to repeat the trip we’d just done (more on that later). He doesn’t know French, but he did pick up on the word infraction. I assured him that everything was alright, but he certainly didn’t believe me. In fact he was almost panicked. As expected we got around the infraction by a judicious use of cash. The guard was good natured about it and Andrew and I expected it, so it wasn’t a big deal.

Baobab Tree

Fourteen hours earlier we had left Dakar at night after striking a deal with some guys in a truck stop. We were told to meet them there with roughly 20,000 cfa each and they’d take us all the way to Bamako. We handed over the cash, and instead of hopping into a truck they loaded us into a 4×4 Toyota; things were looking good. We shot into the busy streets of Dakar, hit traffic, and before long it was dark. I never asked, but I don’t think I was the only one who wondered if we were really headed to Mali.

Matt had had a long trip to Dakar and the jet lag helped him fall asleep in the front seat. Not far from the outskirts of Dakar the tarmac turned not to dirt, but to something more like a crater field. It was so rough our heads regularly hit the truck’s ceiling, even while wearing our seat belts. Many painful hours later we pulled into a dirt lot in a town full of people for some rest. Looking back, the scene reminds me of pictures I’d seen of Mogadishu. I was concerned for my safety, but I was so beat up and tired that I got out of the car and draped myself over the fully loaded truck bed. Though engines, tires, and the like don’t normally make for comfortable sleeping, I was out of commission in seconds.

Donkey with Cart

It wasn’t long until I was rattled awake by the driver and we were back on the godforsaken road. Fourteen hours into the trip the sun was coming up, revealing a dirt two-track next to the paved road. I looked at it enviously, and we finally turned onto it, finding it just as glorious as I expected. Suddenly we were passing cows and 7-foot-tall termite mounds, all the while racing down a dirt track – everything that makes Africa … well, Africa.

I thought we had it bad, but then we started passing the buses we had considered taking from Dakar. Suffice it to say that, of the many we passed, none were operational any longer. At one of the stops Matt got out and I said, “Hey Matt, why the hell is your arm bleeding?” He responded, “Oh, I was resting my head on my arm when I fell asleep last night.” How he slept through that ride is a mystery, and how he slept while his head bashed a hole in his arm is even more confounding.

Matt and the Hilux

We had several more good experiences on our way, most of which were the result of pulling over so our driver could nap while we wandered around meeting folks. We did eventually make it to Bamako, and no, our driver did not take us there. Instead he kindly dropped us off at the side of the road and got us a cab for the short ride into the capital of Mali. So my fears were realized, but it could have been worse – much worse.

Sun Rise


Resolving DTS_E_PRODUCTLEVELTOLOW When Executing SSIS Packages

Here is a post that I hope will help some poor fool save a lot of time.

This is the situation I found myself in:
Sandbox environment:
1) Within Visual Studio, built and executed the SSIS project pointed at the sandbox database – no errors.
2) Within Visual Studio, built and executed the SSIS project pointed at the user acceptance testing database – no errors.

User acceptance environment:
1) Deployed package to Integration Services and validated package without execution – no errors.
2) Executed deployed package, received a DTS_E_PRODUCTLEVELTOLOW error on the first component.

After reading forums and many failed attempts to resolve the problem, I found that SSIS attempts to run on the computer on which you are using Management Studio. That means that if you are executing a package on a remote SQL Server from a machine that doesn't have MSSQL installed, it will fail. Instead, execute the package from a job on the remote SQL Server, install MSSQL on your execution platform, or install Management Studio on the remote SQL Server and execute the package from there.


Automating Database Deployments from Subversion

There was a bit of a learning curve going from manually deployed SQL statements and SSIS packages to automatic deployments, but this is what I did and it seems to work so far.

1. Create a database project in Visual Studio and set up the project to your liking, making sure it builds and deploys correctly – a chore in itself.

2. Get the Slik Subversion command line tools for Windows and install them. The Windows equivalent of $PATH is only updated following a restart of your workstation, after which you can execute the SVN commands without including the path to the executable. In short, restart your workstation!

3. Create a daily build directory to which you can check out from SVN.

4. Write batch script to checkout project from SVN. Here’s an example:
svn export --force https://[subversion url]/project "[build directory]"


5. Write a batch script to build the database project using MSBUILD. Note that the executable is not in the environmental path variable, so the executable and its path must be fully qualified. Here’s an example:
C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\msbuild /target:BUILD ./[path]/[project file]


6. Write a batch script to deploy the database project using OSQL. Here’s an example:
OSQL -E -S[server]\[instance] -i .\[path]\[SQL Script].sql


7. If both batch files test correctly, merge them into a single batch file and make it nice by adding logging and parameterizing paths and server names.

8. Run the process manually until you are confident in it, then execute it regularly through Windows Scheduler.

Extra credit: consider how much more straightforward this would be on any *nix system.


  • Surveying is Spelled with an ‘S’ Right?

    Working on the land led me back to surveying, dragging me back in inch by inch. We’re about to start thinning our timber stands, and having the property lines neatly mapped in my head doesn’t help a logger. I told our forester, “I can mark our boundaries, no problem.” After all, I walk the property lines every year, marking and GPSing the monuments and corners, so how much time could it take?

    It was easy going where the corners were close, but the longer courses (>1000′) were hard to mark with the rolling topography in dense woods limiting the line of sight to 20′ and sometimes less. “No problem”, I thought, I’ll just pull out a handy pocket transit and back it up with a surveyor’s compass and GPS. After all, I had the plat map with the bearings and I could even pull out the theodolite if I have to. Doing this without a rod man would take a little more time running back and forth, but all I needed was a straight line between two pins.

    Several days of work into this and I’m triple checking my work. Why? Well, because things don’t jibe. Our neighbor’s logging operations seem to be offset (both ways) from our property lines. My courses don’t intersect as well as I want them to, no matter how many times I run them, and the evidence of recent surveys doesn’t make sense (their lines aren’t even cut straight).

    I thought that using multiple lines of evidence (old fence lines, surveyor’s compass, transit, old flagging, overgrown roads, GPS, sighting by naked eye, and dead-reckoning) would tease out the real boundaries but everything seems slightly and equally off. Since I’m not getting the degree of consensus I wanted, I’m taking it a step further. I’m going to determine the change in magnetic declination from the original survey (circa 1992), using COGO (coordinate geometry) in ArcGIS, and then applying correction factor to all 59 bearings. Then I’m going back out there to see if things make more sense, but first I have to learn ArcGIS COGO and surveying again. What’s the difference between an azimuth and a bearing again? Oh man yeah, I’ll be starting from scratch and COGO in ArcGIS? Yikes, I’m scared….