Omnio Localia

Here's Destiny

Human Connection

Have you ever noticed the joy that comes when you spend time with your closest family and friends? Those are the relationships that are immune to the effects of distance, separation, or a ticking clock. Perhaps you’ve even felt that same connection to a beloved pet or a particular part of the world: a grandmother’s house, a nearby pond, or the hum of a bustling city. In this age of increasing technology, many have suggested that our ability to relate is deteriorating. What I find interesting is that our greatest technological innovations are created to improve communication. The telegraph, the internet, the cell phone: all of these were invented so that we could stay in touch. Our society is becoming increasingly mobile as many of us move around the world to find jobs and a purpose far from our birthplace. These technologies exist to prevent the fraying or cessation of our relational bonds. I am inclined to disagree with the belief that our relationships are worth less than they used to be. I admit that I have several tertiary friendships as a result of technology, but I think I have made many more close friends because of our technology than I ever would have without it.

e·ther, noun
archaic definition:
a very rarefied and highly elastic substance believed to permeate all space.

I just finished reading Margaret Wheatley’s book Turning to One Another: Simple Conversations to Restore Hope to the Future. I believe most of the book reflects a heartfelt desire on the part of the author to encourage her readers to connect with those around them. She presents a view that we are all connected, that this connection is built in our relationships, and that our relationships are what enable us to change the world for good and protect it from evil. I was recently asked to help my church plan out its future. The way we do church is changing because many people don’t feel connected to it. I have also had many discussions with friends who have suggested that our existing church model isn’t connecting congregants in a spiritual sense.

What might the future church look like?
          What might the future of Christendom look like?
What about simple spirituality?

Connections imply linkages between two objects that do not inhabit the same space. Each of us as humans occupies a different space. All of our perspectives are slightly different. None of us can see the world exactly as any other one of us can. In this sense we are all unique, different, and possess a perspective on life that no other human can have. We will not always agree, and the point of conversation isn’t to agree; it’s to convey learning. We are ultimately diverse and intricately connected. Each of us has a view to share and each of us has a duty to listen.

-Austin
32.3° N by 90.8° W

Processing the Ecological Systems of the United States Raster in Chunks

Ecological Systems of the United States by NatureServe

NatureServe’s Ecological Systems of the United States data layer is a spatial *.tif file representing a high-resolution ecological land classification. Its primary function is to denote terrestrial systems using a method that can be applied consistently across the contiguous United States. I’ve been using it for several different types of analyses as an underlying data analytics layer, especially to solve complex problems in predictive spatial ecology. In my thesis research I derived a series of Landscape Diversity datasets as tertiary products of my main research. This winter I began thinking about these tertiary products in a little more depth and decided to do some analyses with them.

As a pythonista, I really like to handle datasets in numpy arrays. Unfortunately this data layer is on the order of 2.7 GB, which is not trivial to deal with. Numpy spits out MemoryErrors rather than attempting to be useful for arrays this large; that is no fault of numpy, for it can’t operate outside the bounds of its own programming. I also like to use distributed computing solutions, and processing this dataset is not going to be an exception. Below, I have outlined the code I am using to parse this huge raster layer into smaller *.npy chunks for use in further analytical processing.

First, I load the needed python site-packages. I am using the gdal module to open the raster because I need the extra practice using open source GIS rather than another swig of the Esri kool-aid (even if it is very tasty). Since python 3.4 just had its final release, I’m using that here too! I defined a simple function just to test out function annotations, which I find amazing!

from osgeo import gdal
import itertools as it
import numpy as np
import os

def make(directory_path: str) -> str:
    # create the directory if it does not already exist
    if not os.path.isdir(directory_path):
        os.makedirs(directory_path)

    return directory_path

I then create an output directory for all of my smaller data arrays.


output_directories_for_arrays = make(r'/home/user/NatureServe National Vegetation Map/smaller_arrays')

Then I get around to opening the actual dataset; keep in mind that the full dataset isn’t loaded here.
Once I open it, I access the first band, retrieve that band’s dimensions, and print those dimensions for informative purposes.

ds = gdal.Open(r'/home/NatureServe National Vegetation Map/l48_eslf_v2_8a.tif', gdal.GA_ReadOnly)
rc = 1  # ds.RasterCount is only 1

band = ds.GetRasterBand(rc)
cols, rows = band.XSize, band.YSize

print('x-size: ', cols)
print('y-size: ', rows)

I decide I want as many 4000 x 4000 cell array chunks as I can extract from the dataset. These smaller chunks will be used in my analyses.

Because the data is too large to load as a single array, I use the ReadAsArray() method to read in one block at a time. I also add some logic so that my code won’t error-out in locations where a complete 4000 x 4000 array can’t be populated (i.e. the edges). Finally, I add one last check to ensure that the array I am selecting isn’t composed entirely of -32768 null values.


size = 4000
for i, j in it.product(range(0, cols, size), range(0, rows, size)):
    sizex, sizey = size, size

    # shrink the window at the right and bottom edges of the raster
    if cols - i < size:
        sizex = cols - i

    if rows - j < size:
        sizey = rows - j

    # read only this window into memory
    val = band.ReadAsArray(i, j, sizex, sizey)

    # skip chunks that are nothing but -32768 null values
    if val.sum() != (-32768 * sizex * sizey):
        np.save(os.path.join(output_directories_for_arrays, '{0}x{1}'.format(i, j)), val)

It really isn’t all that hard to do. Now I have a directory full of smaller, more manageable numpy arrays that I can load, distribute, and process on multiple machines for further analyses.
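As a quick illustration of that follow-on step, here is a minimal sketch of how the saved chunks might be loaded back and summarized. The tally of ecological system codes is purely a hypothetical example of “further processing,” not part of my actual analyses.

import glob
import os

import numpy as np

chunk_dir = r'/home/user/NatureServe National Vegetation Map/smaller_arrays'

# tally the unique ecological system codes found across all saved chunks
counts = {}
for chunk_path in glob.glob(os.path.join(chunk_dir, '*.npy')):
    chunk = np.load(chunk_path)

    # ignore the -32768 null values when counting
    values, tallies = np.unique(chunk[chunk != -32768], return_counts=True)
    for value, tally in zip(values, tallies):
        counts[value] = counts.get(value, 0) + int(tally)

print('distinct ecological system codes:', len(counts))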

-Austin
32.3° N by 90.8° W

Leaflet.js, HTML5 localStorage, and FeatureGroups

I just started a new project that involves building a map application using the leaflet.js platform. While the details of the complete project are irrelevant to this discussion, I require a system that can remember where features are drawn on a leaflet.js map and reconstruct those features when the page is reloaded. I did not want to employ any server-side logging either. I used the leaflet.draw.js plug-in to provide the drawing logic required by this system. Users of this plug-in know that drawn features (e.g. polylines) are added to an instance of FeatureGroup. A FeatureGroup is a collection of layers; in this case, the drawn items. For my purposes all of my items are polylines.

I thought storing the FeatureGroup in HTML5 localStorage would be a trivial task, but I was confronted with an error. localStorage only stores strings, so an object must be run through JSON.stringify() before it can be stored; unfortunately, a FeatureGroup contains “cyclic object values” that prevent JSON.stringify() from executing properly.

My workaround was to use cycle.js. I should also note my use of the jQuery framework; I use it to create the save button and to run code once the page has loaded, among other things.

var items = new L.FeatureGroup(); // holds the drawn polylines
// the save-button saves any drawn polylines to localStorage
$("#save-button")
    .button()
    .click(function(click_save_button_event){
        // I use decycle() from cycle.js
        var storage_obj = JSON.stringify(JSON.decycle(items));
        localStorage.setItem('items', storage_obj);
    });
// then on page load I reconstruct any drawn features found in
// localStorage
$(function(){
    var stored = JSON.parse(localStorage.getItem('items'));
    if (stored !== null){
        // restore the cyclic references, then rebuild each layer as a polyline
        stored = JSON.retrocycle(stored);
        for(var i in stored._layers){
            var item = stored._layers[i];
            var lyr = L.polyline(item._latlngs, {
                color: '#22CCCC',
                weight: 4});
            items.addLayer(lyr);
        }
    }
});

There we have it. A way to save and load drawn polylines on a leaflet.js map using HTML5 localStorage.

-Austin
32.3° N by 90.8° W

The Founder

Today I’d like to honor the late Dr. Roger Tomlinson. Dr. Tomlinson was the Father of GIS. He built the first GIS, in Canada, by bringing land records into a computer system. His work laid the foundation for one of the fastest growing industries in the world. We as geographers owe him a great debt of gratitude for bringing our discipline into the 21st century and giving it a new purpose. He died on Sunday, February 9, 2014 at the ripe old age of 80.

-Austin
32.3° N by 90.8° W

New Place

Buying a home is a stressful process. Determining how much you can afford, how much you want to pay, where you want to live, where you can live, what features you want, what financing you qualify for, and the sundry other questions can be quite mind-boggling. Luckily my wife and I had an excellent realtor (Re/Max) and mortgage company (Quicken Loans) to walk us through the process. This weekend concludes our first week living in the new home, and it has been wonderful. Previously we were living in an apartment that had lots of interesting features, including rat infestations, mildew, a busted stove-top, poor insulation, and a back-porch balcony that I liked to call the suicide deck (in reference to its solid stability). While that apartment will always be our first home as a married couple, I’m glad we’re moving on.

-Austin
32.3° N by 90.8° W

American Dream

Earlier this week my wife and I went refrigerator shopping. It felt like a very adult thing to do. My co-worker said I wasn’t a senior citizen yet, because that happens when you go shopping for refrigerator parts. This very adult activity led me to wonder if I’ve arrived yet. Have I made it? The answer depends on defining what “it” is. Just because I can name it the “American Dream” doesn’t mean I have a clue what that’s supposed to mean.

What is the American Dream? Own a house? Own a car? Live care-free as a trust-fund baby? Run a boutique shop in a quaint downtown?

Perhaps a better question is: what is the real America? This question was asked and investigated from a spatial perspective by Kevin Kelly. His thesis is an interesting look at the discrepancy between the society, culture, and place we live in and the way we think it should be, or are told it should be. The American Dream is barely more than a continuation of what the Real America is. It is the culmination of political puffery designed to encourage belief in our collective identity. That may seem cynical or apathetic, but in fact it is vitally important that we believe it. It is important that we believe in a dream; it is a unifying ideal that defines what we hope for our nation. The puffery matters…so, no matter what, We Win!

The American Dream is the dream of an individual, a single solitary individual with an idea. The dream is an idea of her future life, his future success, or their collective wellbeing. It is really just the wish of the one. So many things are done in the name of this vague dream. There is a collective idea of what the Dream entails: generally all the good things about life, none of the bad. Included in the roots of our dream are the last vestiges of Manifest Destiny, that we are the Chosen Ones, the Shining City on a Hill, the Greatest Nation on Earth. The dream is the drive to be as excellent as we can be, despite ourselves. It is the wish we have for our children, our own lives, and our friendships. It is the unshakable belief that when I buy a refrigerator I can stock it full of BBQ and Beer. It is coming to terms with who you are now and where you want to be, living with a refrigerator you can depend on.

Pax Americana

-Austin
32.3° N by 90.8° W

IIS 7 and Python 2.7 for CGI

Today I tried to get a python Common Gateway Interface (cgi) script running with IIS 7 on a Windows 2008 R2 Server. I used the socket module included in the default python distribution during my graduate work for network communication, but I haven’t used python as a cgi language for a traditional website. In one of my recent projects I needed a backend system that could do simple GIS calculations (e.g. clip, buffer) without using any existing GIS software: no ArcGIS Server, no open-source playtoys, nothing, nada, zilch. To solve my problem I resorted to Python, the slithering snake of success. To learn a little more about IIS 7 and python as a cgi language, I chose not to use a sockets approach. I ensured cgi was an installed Windows feature on the operating system and followed the instructions on nathanaa5's blog. I created an application in IIS 7 and a handler for python using *.py as the Request Path, C:\Python27\python.exe %s %s as the Executable, and python as the Name.

I then created a python script to test my new python cgi backend and saved it as a *.py file in my application folder previously defined in IIS 7.

#!/usr/bin/env python
# -*- coding: UTF-8 -*-

print 'Status: 200 OK'
print 'Content-Type: text/html'
print
print 'Python is working as a cgi...'

I pointed my browser to http://localhost:80/cgi-bin/app.py and was greeted by an HTTP Error 502.2 - Bad Gateway. It said I had malformed http headers; how unfortunate. A few google searches later, it was apparent that many pythonistas encounter this problem. I fiddled around with IIS 7 for a while before I stumbled upon a fix: in the python file handler in IIS 7, change the Executable to C:\Python27\python.exe -u "%s". Then ensure that you do in fact have good HTTP headers in your python script and refresh your browser. It worked for me!

Blogging

The last time I had a blog I had more time to actually work on it. This time, I’m having trouble deciding what the best post frequency is. A couple of the posts I’ve made were obligatory remarks, blurbs, and verbiage intended only to force myself to be a consistent daily author. Daily updates are very hard to stay on top of. I’m not sure how frequently I will post from now on. Perhaps two or three times per week? I know I want to be at least somewhat regular in my posting. I guess we’ll just have to see how it goes.

-Austin
32.3° N by 90.8° W

Mr. Sentdex

Over the holidays I got restless and started looking at different methods for processing and displaying financial data. I thought it would provide some decent practice with python’s matplotlib site-package. Every time I start a new programming hobby I like to do a little youtube and google research, and I found sentdex’s youtube channel to be an extremely helpful starting point. His channel discusses different stock technical analysis and display techniques using python. Being a pythonista myself, it was fun to improve upon some of what he did. In his youtube videos he uses text files or real-time pulls of stock data from yahoo’s financial api. Instead of using text files, I set up a postgresql database to store the data in. I then piped this data into the python data analysis library’s DataFrame object for further processing.
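For context, here is a minimal sketch of what that storage might look like. The table name (daily_prices), column names, and connection details are assumptions for illustration only, not necessarily what I used.

import psycopg2

# connect to a local postgresql database (credentials here are placeholders)
conn = psycopg2.connect(dbname='stocks', user='postgres', password='secret', host='localhost')
cur = conn.cursor()

# one adjusted closing price per trading day
cur.execute("""
    CREATE TABLE IF NOT EXISTS daily_prices (
        trade_date date PRIMARY KEY,
        adj_close  numeric NOT NULL
    );
""")
conn.commit()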

I connect to the postgresql database using psycopg2, instantiate a cursor, and pull down the data table. I then load the rows of data using the .from_records() method of the DataFrame object, add the column names, and sort the frame by the trade date, which serves as the index.
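Sketching that first step against the hypothetical daily_prices table above, the query might look like this; cursor.fetchall() is what produces the rows handed to the DataFrame below.

# reusing the conn and cur from the sketch above, pull the full
# price history, newest first
cur.execute("SELECT trade_date, adj_close FROM daily_prices ORDER BY trade_date DESC;")
rows = cur.fetchall()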

import pandas as pd

# build the frame from the fetched rows, then index it by trade date
df = pd.DataFrame.from_records(data=rows,
               columns=['trade_date', 'adj_close'])
df.index = pd.to_datetime(df.pop('trade_date'))
df.sort_index(ascending=False, inplace=True)

It then becomes easy to use the DataFrame as the basis for more high-performance analysis. This is a great way to empower yourself to conduct your own analysis instead of relying on other people’s work. Consider, then, the following graphic, produced in a process similar to the one introduced by Sentdex on his channel:

Consider this sample data for demonstration purposes only.
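If you want to reproduce a similar chart, here is a minimal sketch of the general idea: plot the adjusted close with a simple moving average overlaid. The 50-day window is an arbitrary choice for the example, and the rolling() call assumes a reasonably current pandas release.

import matplotlib.pyplot as plt

# psycopg2 returns numeric columns as Decimal; cast to float for the math and plotting
df['adj_close'] = df['adj_close'].astype(float)

# compute a 50-day simple moving average over chronologically ordered prices
df = df.sort_index()
df['sma_50'] = df['adj_close'].rolling(window=50).mean()

fig, ax = plt.subplots(figsize=(10, 5))
ax.plot(df.index, df['adj_close'], label='adjusted close')
ax.plot(df.index, df['sma_50'], label='50-day simple moving average')
ax.set_xlabel('date')
ax.set_ylabel('price')
ax.legend(loc='best')
plt.show()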

Even though this project isn’t very large, I rely on data systems like postgresql and data structures like pandas.DataFrame so that when I’m ready to scale up, I can do so easily.

Happy New Year!
