Golden Mummies of Egypt

One year ago, we made our last cultural visit before Buffalo closed down for Covid-19. Our destination was the Buffalo Museum of Science, to see the Golden Mummies of Egypt. These are not mummies from the time of the pyramids but more modern mummies, from the era of Alexander the Great through the later Roman rule, a period of 600 years (300 BCE to 300 CE, with Cleopatra and Mark Antony at the midpoint) when Greeks and Romans or their surrogates ruled.

The Egyptians decorated these mummy cases with encaustic paintings of beeswax and pigment. The Greeks developed this decorative technique for their ships, which they waterproofed with wax and resin. By adding coloring to the mix, they created encaustic decoration. Next time you imagine the ships Helen launched, think color.

I used my then-new iPhone 11 Pro in portrait mode to capture the exhibition's treasures. I am uncertain exactly how the phone's camera works, but it appears to track its subject and integrate the image, removing the effects of the photographer's shaking hand. It did a fantastic job of capturing the exhibition's objects in low light and behind glass. Here are the results.

Mythical Lucca Labyrinths

San Martino Cathedral & Labyrinth

Labyrinth carved on a pillar of the portico of Lucca Cathedral, Tuscany, Italy. The Latin inscription says “HIC QUEM CRETICUS EDIT. DAEDALUS EST LABERINTHUS . DE QUO NULLUS VADERE . QUIVIT QUI FUIT INTUS . NI THESEUS GRATIS ADRIANE . STAMINE JUTUS”, i.e. “This is the labyrinth built by Dedalus of Crete; all who entered therein were lost, save Theseus, thanks to Ariadne’s thread.”

Photos and quote by: Myrabella / Wikimedia Commons, CC BY-SA 3.0

For years I thought of labyrinths as simple objects. If one followed the path, the designer presented no choices; eventually, one would exit or die in the mouth of a monster. The outcome was not your choice but the intention of the labyrinth's designer.

Then in the fall of 2018, my attitude changed. While visiting Lucca, I had a wild thought. I would create mythical labyrinths of Lucca. They would be of the same species but different individuals.

What is the genome of the Lucca labyrinth? I had no idea. How would I design mythical descendants? Again, I had no idea. However, I did know how to start the design of a computer program. First, I needed a vocabulary to describe the program. Graph theory, a branch of mathematics that describes things (called vertices or nodes) and their relationships (called edges or links), provided that vocabulary. The following figure is a simple graph.

Figure by AzaToth, Public Domain, Wikimedia Commons

If node one is a labyrinth entrance and node six the exit, then "walking" 1-5-4-6 is one of many paths. However, for my mythical labyrinths, I need a Hamilton path. Hamilton paths are the result of self-avoiding walks that visit all nodes. For this graph, 1-5-2-3-4-6 is a Hamilton path.
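To make this concrete, here is a small Python sketch that checks whether a walk through the graph is a Hamilton path. The edge list is my reading of the figure above, so treat it as an assumption:

```python
# Edge list assumed from the figure's six-node example graph.
edges = {(1, 2), (1, 5), (2, 3), (2, 5), (3, 4), (4, 5), (4, 6)}

def connected(a, b):
    """True when an edge joins nodes a and b (edges are undirected)."""
    return (a, b) in edges or (b, a) in edges

def is_hamilton_path(path, nodes={1, 2, 3, 4, 5, 6}):
    """A Hamilton path visits every node exactly once along existing edges."""
    return (set(path) == nodes
            and len(path) == len(nodes)
            and all(connected(a, b) for a, b in zip(path, path[1:])))

print(is_hamilton_path([1, 5, 2, 3, 4, 6]))  # True
print(is_hamilton_path([1, 5, 4, 6]))        # False: a path, but skips 2 and 3
```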

My design uses a graph that maps each location (the nodes) to neighboring nodes (using edges). The program then iterates randomly over Hamilton paths through the nodes. It halts when it finds a mythical Lucca labyrinth.
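The random iteration can be sketched as follows. This is a minimal illustration, not my actual program: the halting condition is stubbed out as an `accept` function, since its real definition comes in a later post, and the neighbor map is my assumed reading of the example graph above:

```python
import random

def random_self_avoiding_walk(neighbors, start, n_nodes):
    """Extend a path with random unvisited neighbors; may dead-end early."""
    path = [start]
    while len(path) < n_nodes:
        choices = [n for n in neighbors[path[-1]] if n not in path]
        if not choices:
            return None            # walk trapped itself: not a Hamilton path
        path.append(random.choice(choices))
    return path

def search(neighbors, start, n_nodes, accept, tries=10000):
    """Retry random walks until one satisfies the halting condition."""
    for _ in range(tries):
        path = random_self_avoiding_walk(neighbors, start, n_nodes)
        if path and accept(path):
            return path
    return None

# The six-node example graph as a neighbor map (assumed from the figure).
neighbors = {1: [2, 5], 2: [1, 3, 5], 3: [2, 4], 4: [3, 5, 6], 5: [1, 2, 4], 6: [4]}
path = search(neighbors, start=1, n_nodes=6, accept=lambda p: True)
```

Starting from node 1, the only successful self-avoiding walk that covers all six nodes is 1-5-2-3-4-6, so the search converges to it.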

I plan two more posts about this work.

  • The next will describe the method of iterating over possible paths.

  • The final post will describe the halting condition. That is the definition of mythical Lucca labyrinths.

The remainder of this post addresses the graph design and its relationship with the drawing of the labyrinth.

Returning from Lucca, I printed a drawing of the labyrinth. Taking a red magic marker, I attempted to discover the form of a graph and path that could describe both the Lucca labyrinth and its mythical siblings. The first and unmistakable set of nodes follows.

The resultant graph and path, in red, follow.

The top row of the graph, row 0, describes the central ring of the labyrinth. The bottom row, 10, maps the outer ring of the labyrinth. The exit from the labyrinth’s central court, column 0, is to the left and the path to the exterior is to the right. The red Hamilton path describes the route of the labyrinth.


This eleven by ten rectangular graph might satisfy my needs. However, it has a fatal flaw. In every row, the path connects each odd column (1, 3, 5, 7) to the even column on its right. My mythical labyrinths will also require these connections; without all of them, some of the sweeping arcs would be missing. These arcs are a chief feature of the Lucca labyrinth and, therefore, also required for my myths. Using this graph, the search space for my mythical labyrinths is enormous, and the program run time would grow accordingly, probably beyond my lifespan and that of my computer.

Replacing each pair of nodes that will always be connected with a single node avoids checking 44 edges to ensure their presence. However, the new graph no longer contains information on every turn; the labyrinth drawing software must supply this missing information. The following analogy illustrates my design. Think of the labyrinth's interior columns as a snowboarder performing on four connected half-pipes. When our snowboarder enters the half-pipe, gravity and then inertia carry him, with no alternatives, to the opposite edge. If directed to the next column, the snowboarder exits the half-pipe. When remaining in the half-pipe, the snowboarder first turns 180° to the next row; then, once again, gravity and inertia drive our snowboarder back across the half-pipe.

The eleven by six graph and drawing for the Lucca labyrinth follow.

In closing, here are two of my mythical Lucca labyrinths.

A Body with Two Heads

Recently I discovered a new poet, Tishani Doshi. My reading of Tishani led me to some of Homo sapiens' oldest (6500 BCE) large sculptures.

The line that triggered my search left me puzzled: "… I dreamed I had a body with two heads like those ancient figures from the Zarqa River …".

I immediately Googled "figures" and "Zarqa." The most useful hit was a scholarly paper:

> Schmandt-Besserat, Denise. "’Ain Ghazal "Monumental" Figures." Bulletin of the American Schools of Oriental Research, no. 310 (1998): 1-17.

Luckily, JSTOR.org allows 100 reads of papers during COVID. Seventeen pages later, I had a good idea of the statues' construction: plaster-covered reed armatures. I also had the probable purpose of the statues: exorcists used the figures in a ceremony for ridding a home of angry ghosts. However, the paper's black-and-white photos were disappointing in the online reading tool.

My next Google added "Ain Ghazal" as a search term. This search yielded a Wikipedia article, "’Ain Ghazal Statues," which provided excellent photos.

All Photos by Osama Shukir Muhammed Amin FRCP(Glasg)

How did Tishani Doshi discover these statues? Did she visit a museum in Amman, Abu Dhabi, Paris, or London? Maybe Tishani read of Jacob seeing the face of God. Perhaps she wondered what else happened on the banks of the Jabbok (Zarqa).

Tishani Doshi is an author of both novels and poetry. Born in India, she writes in English. The poem that started this search is titled "Self" and is one of six in Granta 151. I plan to read Girls Are Coming Out of the Woods, her third book of poetry, next.

Matera, Basilicata or Lucania?

Did you ever read a book about a place, then later plan a trip there and fail to connect the two in your mind? Last year, when we prepared for a trip to Matera in the Italian region of Basilicata, I never connected my memories of "Christ Stopped at Eboli" by Carlo Levi with the journey. Levi's book introduced me to the rugged, parched terrain of south-central Italy and the "feudal" organization of society there before World War II. But that knowledge seemed to have nothing to do with our travel south.

For the American ear, a more straightforward translation of "Cristo si è fermato a Eboli" would be "Modern Western Civilization never proceeded south of Eboli." In the book, Carlo Levi is banished from his home in Piedmont to the village of Gagliano, Lucania, because of his opposition to the Fascists and the Abyssinian War. Gagliano is a name used only in the book; the actual location of Levi's banishment was Aliano, fifty miles southwest of Matera.

Why do we call this region Basilicata today, while Levi called it Lucania? Lucania is a very ancient name for the area. The Lucani (Lucanians) ruled the region until they were conquered by the Romans during the Second Punic War. The name that was good enough for the Romans was good enough for the Fascists in their recreation of the empire. The name Basilicata, the current one, comes from the period of Byzantine rule after the fall of the Western Roman Empire.

Three things stick in my mind as features of this world. First is the rugged terrain and the extreme vegetation that grows on this hard earth.


The second memory is of Levi's sister's horror at the poverty she witnessed in Matera while visiting her brother. Following winding mule paths into the Sassi, she saw peasants living in caves. The caves housed the families together with their animals: people, pigs, mules, and chickens living in shared rooms. The ceiling of each of these caves, with its stone facade, formed the street and the floor of the cave above.


My final memory from Levi's book is the complete social separation between the professional and landowning class and the peasants. The top of society lived comfortable lives with some leisure, while every day the peasants suffered on the hard land.

This hard life of the peasants started to change in the 1950s as the Italian government closed the slums of the Matera Sassi and moved the poor to modern apartments. Also, a new functioning social safety net guaranteed health care and eliminated starvation.

On our third and final day, we left the city of Matera and explored the much older ruins of houses and churches east of the ravines of the Sassi. This area, the Parco della Murgia Materana, witnesses 7,000 years of human society, beginning with the Neolithic period and reaching its peak 1,000 years ago. After these early developments, civilization moved to the location of modern Matera.

The park visitor center, Jazzo Gattini, is a 200-year-old "sheepfold" where shepherds sheltered their flocks for the night. The center provides tours, an educational facility for students of all ages, video facilities, maps, and food for guests. Our experience here was way beyond expectations. One of the English-speaking members of the staff sat down with us and provided the history of the area and recommendations based on my/our hiking capabilities (he gave us more credit than we deserve).

After watching several videos, we headed off to find the 7,000-year-old Neolithic village of Murgia Timone. In the first few steps, I made a navigational error, walking east down a dirt road rather than along a footpath. After covering a greater distance than indicated on the map, and now headed southeast rather than east, we were lost and turned left into a long driveway toward some older buildings. Although the buildings were unused, there were several campers with recreational vehicles in the area. Approaching the first vehicle, we were happy to find that the occupant spoke fluent English. Unfortunately, he was of no help, as he had just arrived, but he did recommend we try the older Italian man at the far end of the area. Michele, who spoke only Italian, was checking his beehives there. He had known about the Neolithic village for fifty years but had never taken the time to see it; he would be happy to take us if we waited a few minutes, as he too wanted to visit. I was surprised to find we were only five minutes away from the nearest dwelling.


There were six homes, all very similar, with stone-curbed "streets" leading between them. While walking past the caves, Michele pointed out the herbs growing wild at our feet. He picked sage, oregano, and thyme, crushed each in his hand, and let us smell the great aromas. Then he dropped them on the ground, as it was illegal to carry them from the park. My favorite herb was the wild saffron crocus.


Usually, these flowers would bloom in the warm spring when his bees were very active. However, in recent years it has been far too dry in the spring, and now the crocus blooms in late October and November, when it is too cold for his bees to fly. Basilicata crocus honey will soon be another loss to climate change.

Leaving the Neolithic village, Michele offered to show us the Belvedere panoramic lookout across the ravine from the Sassi. There, continuing our conversation, Michele described his childhood in the Sassi. Ten years after his birth, Michele's family had been one of those moved from the cave homes in the 1950s.


On leaving the Belvedere, we asked Michele to pose for a photo with us. Sadly, he refused. Our acquaintance was one of those exceptional unplanned surprises that seem to happen when traveling.

One thing Levi's book could not prepare me for was the stone churches (Le Chiese Rupestri) of Matera. Some of these churches are more than a thousand years old. They are creations of negative architecture: the columns and arches that mimic those in churches built of quarried stone are what remained after the rock was excavated. The earliest were dug (I almost wrote built) on the east side of the ravine by Byzantine monks in the eighth century. These monks were fleeing religious persecution for their creation of images. Most of these images have been ravaged by time, water, and, unfortunately, collectors of artifacts. However, what remains has a new beauty as the remnants of the art merge into the rock.


This year, 2019, Matera has been selected as the European Capital of Culture. Click here to learn more. For decades the Sassi remained uninhabited, but today the former slums are populated with hotels, B&Bs, museums, and shops of all kinds, as well as private homes. Luckily, this restoration has preserved the beauty of this ancient city.


Predicting Football Playoff Scores

For the last month I have been struggling to publish a credible machine-learning prediction of NFL playoff scores. Why? Because I want to get my hands dirty with data analytics. Why football? Because politics or social issues might lose friends, financial markets might lose friends' money, and science stuff would be just too dry. Besides, most of my friends have some interest in football.

Two weeks ago I published my predictions for the AFC and NFC Championship Games. Since that time I have been transferring my code to Google Colaboratory, so you can run it, and writing this blog post. Now it is the eve of Super Bowl LIII and time to wrap it up. My prediction? LA 29 NE 23. Would I bet on it? No way.

At this point in the post, I suggest you run my code and see all of this year’s playoff score predictions. Click on “Open in Colab” below. Using the “Runtime” menu, select “Run all.” Then scroll the right pane to the bottom and wait for the scores to compute.

Open In Colab

Am I pleased with my program? Yes; I set out to learn new skills, and mission accomplished. Is it accurate? Sometimes. You will note that I have nailed both LA scores and missed badly on the low side with New England. I obviously have work to do.

I plan to continue improving the program over the next year. The effort to date has been a freewheeling hacker approach; I need to introduce "light" engineering formalism, starting with testing each change against past seasons. Something like: admit a change only if it improves predictions in three out of four seasons.

For those of you who have no interest in the details, this is a good point to leave. I will post on this blog each time I learn major new concepts about machine learning or football.

The first step is always: where to get data? Was this project going to end before it started? Luckily, for NFL football game data, someone else has done most of the work. Andrew Gallant's (aka Burntsushi) NFLDB provides an interface from NFL game day on the web to a PostgreSQL database on my MacBook Pro. The only downside to NFLDB is that it uses Python 2.7, and I am committed to Python 3. So now I have Python 2.7 with the Burntsushi software, but this is not a great inconvenience since I only use it to update the PostgreSQL database. Also, I added psycopg2, as an interface to PostgreSQL, to my Python 3.

What data does NFLDB provide? A hierarchical set of data tables. The highest level is "game." It contains the teams, who is home, and the final and quarter scores. The second level is "drive." Its contents include the drive start (time and yard line), the end condition, e.g., Touchdown, Field Goal, Punt, Interception, Fumble, along with the ending yard line, the number of first downs, yards gained, penalty yards, and elapsed time. The drive data is as deep as I have gone for this first version of my algorithm. Below the drive, there are "play," "play-player," and "player" data. These tables are detailed enough to call the game for a radio broadcast: who carried the ball, who threw the pass, to which intended receiver, and, on defense, who made the tackle and who assisted. With NFLDB, the answers to all of these questions are yours.

How does one predict a result (e.g., a football score) from data? Three different components must be chosen. First, a mathematical model for the result must be selected; this model can range from a linear equation to a many-layer neural network. Second, one develops a list of features, chosen from or calculated from the data. Finally, a training algorithm, or process, is chosen. Another expression for the training algorithm is machine learning.

I am using a simple linear equation to model football scores.

Score_o = \omega_h \cdot Home + \sum_{n=1}^{k_p} \omega_n \cdot O_n + \sum_{m=1}^{k_d} \omega_m \cdot D_m

where Score_o is the score fitted/predicted for the team, Home is one if the team is at home and zero otherwise, O_n are the offensive features of the team, and D_m are the defensive features of the opposing team. Finally, the \omegas are the weights fitted by the training algorithm.

Useful features are correlated with football scores. They also need to be much smaller in number than the data they will be fitted to. The football regular season has 256 games, yielding 512 team scores. If we chose 512 features, our fit to the model would only describe what has happened and have no predictive power. I picked my feature set primarily as a set of probability estimates; the table below describes the offensive features. To be precise, read "probability" as "probability estimate from the regular season." I have also incorporated the possibility of per-play measurement features and added one example, "pyp," which is net penalty yards per play.

Turnover: Probability that an offensive drive results in a turnover.
safety: Probability that an offensive drive results in a safety.
TDle20: Probability that an offensive drive, starting between own goal line and own 20, results in a Touchdown.
TDle40: Probability that an offensive drive, starting between own 20 and own 40, results in a Touchdown.
TDle60: Probability that an offensive drive, starting between own 40 and opposing 40, results in a Touchdown.
TDle80: Probability that an offensive drive, starting between opposing 40 and opposing 20, results in a Touchdown.
FGle20: Probability that an offensive drive, starting between own goal line and own 20, results in a Field Goal.
FGle40: Probability that an offensive drive, starting between own 20 and own 40, results in a Field Goal.
FGle60: Probability that an offensive drive, starting between own 40 and opposing 40, results in a Field Goal.
FGle80: Probability that an offensive drive, starting between opposing 40 and opposing 20, results in a Field Goal.
RZ: Probability that an offensive drive which reaches the opposing 20 results in a Touchdown.
nfd: Probability that an offensive drive with no first downs results in a Punt.
Ple20: Probability that an offensive drive starts between own goal line and own 20.
Ple40: Probability that an offensive drive starts between own 20 and own 40.
Ple60: Probability that an offensive drive starts between own 40 and opposing 40.
Ple80: Probability that an offensive drive starts between opposing 40 and opposing 20.
pyp: Average yards lost per play as a result of a penalty.

What about defense? I have used the same measures as on offense. Instead of the offensive team against all competitors, I calculated each parameter for all competitors against the defensive team. Yes, the direction of goodness is reversed, but the model can handle that. For example, a good defense will have a lower probability of allowing a touchdown in the Red Zone. These new parameters use the same names with a preceding "D."
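As an illustration of how one of these probability estimates might be computed from drive records, here is a sketch. The field names (`start`, `result`) and the sample records are hypothetical, not NFLDB's actual columns:

```python
def drive_probability(drives, started_in, outcome):
    """Estimate P(outcome | drive started in zone) as a simple frequency."""
    eligible = [d for d in drives if started_in(d)]
    if not eligible:
        return 0.0
    return sum(1 for d in eligible if d["result"] == outcome) / len(eligible)

# Illustrative drive records: start yard line measured from own goal line.
drives = [
    {"start": 12, "result": "Touchdown"},
    {"start": 18, "result": "Punt"},
    {"start": 5,  "result": "Punt"},
    {"start": 35, "result": "Touchdown"},
]

# TDle20: drives starting between own goal line and own 20 that end in a TD.
td_le20 = drive_probability(drives, lambda d: d["start"] <= 20, "Touchdown")
print(td_le20)  # 1 of 3 qualifying drives
```

The defensive version of the same feature would simply filter for drives by all opponents against the team in question.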

To train my model, I use Bayesian Ridge Regression from Scikit-Learn. Why? Because I read somewhere that it minimizes the problems of multicollinearity and overfitting with inappropriate features.
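Here is a minimal sketch of the training step, with synthetic data standing in for the real feature matrix; the shapes and numbers are illustrative only:

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Synthetic stand-in for the real feature matrix: one row per team-game,
# columns = home flag plus offensive and opposing-defensive features.
X = rng.random((512, 20))
true_w = rng.normal(0, 10, size=20)
y = X @ true_w + rng.normal(0, 3, size=512)   # noisy "scores"

model = BayesianRidge()
model.fit(X, y)                               # learn the omega weights

next_game = rng.random((1, 20))               # features for an upcoming game
predicted_score = model.predict(next_game)[0]
```

With the real features, each playoff matchup yields two rows (one per team), and the two predictions form the predicted final score.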

If you have not run the code yet, I suggest you  click on “Open in Colab” below:

Open In Colab

I changed my approach to initializing the environment from my previous Colaboratory notebooks. This time I used %%shell to write a shell script to load the data files. On my own Mac, I use SQL to fill my Pandas dataframes; saving off to CSV files seems to be a good way to move them to Colaboratory.

First maze notebook

My love of programming mazes started with the purchase of “Mazes for Programmers: code your own twisty little passages,” see A-mazing books.  For almost two years I have often returned to mazes, programming just for fun.

This Colaboratory notebook cuts maze programming to its basics. To build the notebook, I have taken my current python maze packages and removed all code except for that required for this project.  By minimizing the code, I can show it all in the notebook; I have no hidden tricks.  My goal was to produce a maze puzzle, a perfect maze with only one solution.

To open this notebook click the following icon:

Open In Colab

The easiest way to run the maze program is to select "Run all" from the "Runtime" menu. The program will generate and print to PDF a maze of the size specified on the Maze Parameters form. To download this PDF, check the "downloadPDF" box on the "Output" form and select "Run the focused cell."


For an overview of my Colaboratory Python notebooks, take a look at Non Photo-Realistic Mona Lisa, Revisited. That previous post introduces the manner in which I have structured my notebooks; for this maze notebook, I have used a similar structure.

Regarding copyrights and license: the maze generation code originated in the Ruby programming language in the book "Mazes for Programmers: Code Your Own Twisty Little Passages" by Jamis Buck. Sami Salkosuo translated it into Python. I have added a new concept to the Distances class, distance off the solution path; adapted the Cell class to more directly support other cell shapes such as polar, hex, and 3D; and added the Cairo drawing routines to both the Cell and Grid classes.

Section 1 is a direct lift from Non Photo-Realistic Mona Lisa, Revisited; it installs the Cairo drawing package.

Section 2 defines the Cell class, which describes the data and methods implementing each cell of the maze. This data includes the cell's neighbors and which of the neighbors are connected by missing walls. When drawing the maze, fillCell is used to color the background, and drawCell constructs the walls. Note that since the maze is drawn top to bottom and west to east, only the east and south walls are considered by drawCell.
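For a feel of the design, here is a stripped-down sketch of the Cell idea; this is an illustration of neighbors versus links, not the notebook's actual code:

```python
class Cell:
    """One maze cell: knows its neighbors and which walls are removed."""
    def __init__(self, row, col):
        self.row, self.col = row, col
        self.neighbors = {}     # direction -> neighboring Cell
        self.links = set()      # cells reachable through a missing wall

    def link(self, other, both=True):
        """Knock out the wall between this cell and another."""
        self.links.add(other)
        if both:
            other.link(self, both=False)

    def linked(self, other):
        return other in self.links

# Two side-by-side cells: neighbors first, then carve the wall between them.
a, b = Cell(0, 0), Cell(0, 1)
a.neighbors["east"], b.neighbors["west"] = b, a
a.link(b)
```

Every pair of adjacent cells is a neighbor pair, but only linked pairs have an open passage between them; the drawing code draws a wall wherever a neighbor is not linked.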

Section 3 defines the Distances class. Jamis Buck and Sami Salkosuo both defined distances as the distance from a cell to all other cells. I added definitions for add and sub so that algebraic combinations, such as distance from the solution path, are possible.

Section 4 defines the Grid class, including the data and methods for implementing each maze. My main additions are getDistancesFromPath, which uses distance algebra to define distance off the path as distance from start + distance from goal − path length, and the drawing routines drawGrid and drawOpening. Unlike other maze programs I have written, this one produces a PDF document: first it illustrates the unsolved maze, and then a second page provides the solution. Most of the drawing code comes directly from my work with scalable vector graphics. To produce page-filling PDF files, I used the Cairo scale and translate functions, which provide the transformation from my arbitrary SVG space to the fixed paper mapping of PDF.
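The distance algebra can be sketched like this; the layout is a contrived six-cell example, not the notebook's Distances class:

```python
class Distances(dict):
    """Cell -> distance from some root; supports cellwise algebra."""
    def __add__(self, other):       # combine two distance maps cellwise
        return Distances({c: d + other[c] for c, d in self.items() if c in other})
    def __sub__(self, scalar):      # subtract a constant (e.g. the path length)
        return Distances({c: d - scalar for c, d in self.items()})

# Contrived layout: cells 0-4 form the solution path; cell 5 is a dead end
# hanging off cell 2.
from_start = Distances({0: 0, 1: 1, 2: 2, 3: 3, 4: 4, 5: 3})
from_goal  = Distances({0: 4, 1: 3, 2: 2, 3: 1, 4: 0, 5: 3})
path_length = 4

# Distance off the path = distance from start + distance from goal - path length.
off_path = (from_start + from_goal) - path_length
```

Cells on the solution path score 0; the dead-end cell scores 2 (one step out plus one step back), which is exactly the property used to shade how far a wrong turn strays.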

Section 5 implements Wilson's algorithm for maze generation. For Jamis Buck's description of this algorithm and his original Ruby code, follow this link.
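For a feel of the algorithm, here is a compact, self-contained sketch of Wilson's loop-erased random walk on a rectangular grid; the notebook's version operates on the Grid and Cell classes instead of plain tuples:

```python
import random

def wilson_maze(rows, cols, seed=None):
    """Wilson's algorithm: loop-erased random walks yield a uniform spanning tree."""
    rng = random.Random(seed)
    cells = [(r, c) for r in range(rows) for c in range(cols)]

    def neighbors(cell):
        r, c = cell
        steps = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        return [(a, b) for a, b in steps if 0 <= a < rows and 0 <= b < cols]

    in_maze = {rng.choice(cells)}               # seed the tree with one cell
    links = {cell: set() for cell in cells}     # missing walls between cells
    while len(in_maze) < len(cells):
        # Random walk from an unvisited cell, erasing loops as they form.
        walk = [rng.choice([c for c in cells if c not in in_maze])]
        while walk[-1] not in in_maze:
            nxt = rng.choice(neighbors(walk[-1]))
            if nxt in walk:
                walk = walk[:walk.index(nxt) + 1]   # erase the loop
            else:
                walk.append(nxt)
        for a, b in zip(walk, walk[1:]):        # carve the walk into the maze
            links[a].add(b)
            links[b].add(a)
            in_maze.add(a)
    return links
```

Because every carve adds exactly one new cell per new edge, the result is a perfect maze: a spanning tree with one unique route between any two cells.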

Finally, the main routine imports the required Python packages, reads the user input from a form, generates a Grid, initializes it with initWilsonMaze, chooses random entry and exit columns with randint, and produces the PDF file with drawGrid. displayMaze reads the binary PNG image from the Cairo surface and displays it in the notebook.

The last two cells are provided to download the PDF to your computer and to download setup.log if problems occur.

If you haven’t launched the app yet, click the icon.

Open In Colab

Non Photo-Realistic Mona Lisa, Revisited

Since I last wrote on stippling or half-toning algorithms in Non Photo-Realistic Mona Lisa, I have developed a new image tiling algorithm and finalized my plans and procedures for sharing python code.

Algorithm Update

My original algorithm divided the input photo into n rectangles of equal blackness by slicing in one direction; then each of these rectangles was divided, and so on. Although this method produced very recognizable results, it also produced patterns in the image that were artifacts of the algorithm.

The two images below illustrate the first slice into 5 rectangles of equal blackness, followed by a second slicing of each of the 5 into 5 more using the original algorithm.

In an attempt to break up the artifact patterns, I have developed a new approach which uses recursion to slice up rectangles. Instead of selecting n slices of equal black ink, the new code first looks for the largest slice it can remove from either end containing 1/n of the ink, and then recursively calls itself to process the remaining n−1 slices, which may now switch directions.

The figures above illustrate the approach for division by 5, followed by a second division by 5. First, the algorithm finds the most massive horizontal fifth at the top end of the vertical image. Next, one-fourth of the left side of the remainder is chosen. Then the top third of the rest is selected, and the final portion is split into left and right. In the second pass, illustrated by the image on the right, each of these rectangles is processed similarly. Note that the rectangles appear more random than those generated by the original algorithm.
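My reading of that procedure can be sketched in a few lines of numpy; this is an illustration of the idea, not the notebook's sliceIm code:

```python
import numpy as np

def slice_rects(ink, n, horizontal=True):
    """Cut the widest end-slice holding roughly 1/n of the ink, recurse on the rest."""
    if n == 1 or ink.sum() == 0:
        return [ink]
    axis = 0 if horizontal else 1
    profile = ink.sum(axis=1 - axis)          # ink per row (or per column)
    cum = np.cumsum(profile)
    total = cum[-1]
    # Smallest slice holding >= 1/n of the ink, measured from each end.
    k_front = int(np.searchsorted(cum, total / n)) + 1
    k_back = len(profile) - int(np.searchsorted(cum, total * (n - 1) / n,
                                                side="right"))
    if k_front >= k_back:                     # keep the more "massive" end slice
        piece, rest = np.split(ink, [k_front], axis=axis)
    else:
        rest, piece = np.split(ink, [len(profile) - k_back], axis=axis)
    # Recurse on the remainder with one fewer slice, switching direction.
    return [piece] + slice_rects(rest, n - 1, not horizontal)

pieces = slice_rects(np.ones((10, 10)), 5)    # uniform "ink" for a quick check
```

On a uniform image the five pieces carry nearly equal ink; on a real photo the slice widths vary with where the black pixels sit, which is what breaks up the grid-like artifacts.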

What are these algorithms for? First, for some images, the output rectangles have an incredible abstract beauty of their own. My future plans include both geometric-pattern half-tone images, where an icon of the correct size represents the actual black content, and photo mosaics, where a small photo is placed in the center of each rectangle so that, viewed from a distance, they merge into the original image.

Run The Code

The big news is that I have finalized the procedures I will use to share the code I write. Google's Colaboratory provides a Linux processor on which you can execute Python Jupyter notebooks. The easiest way would have been to store my notebooks on a Google Drive and share them, granting permission to view and comment to all. However, the process of transferring ownership of a copy so that it can be executed and/or changed without modifying the original seemed overly complicated. In the end, I decided to use Github.com for storage of my notebooks. Colaboratory provides tools which open my Github repository notebooks as copies owned by you. Oh, you will need a Google account; if you're not currently logged in, Colaboratory provides a button to start the login process.


Colaboratory provides menus and buttons in the toolbar at the top of the page. Of specific interest now are the File menu, which allows you to save the notebook (file type .ipynb) to your Google Drive or Github or download it to your computer, and the Runtime menu, in which the "Run all" command can be used to execute the notebook.

At this point I suggest you open the Photo_SlicerTiler notebook. Click on the Icon below to open the Notebook in a new Tab or Window (depending on your browser settings).

Open In Colab

Under the notebook toolbar, you will find two window panes: to the left, a table of contents which allows navigation through the notebook, and to the right, the notebook itself. The right pane is divided into cells which contain either text (Markdown formatted) or code. The Runtime menu controls the execution of the notebook: code cells can all be executed one after the other with "Run all," a few cells can be run with "Run selected," or they can be single-stepped with "Run the focused cell."

The code cell of the first section, "Setup Python environment," first checks for the existence of the Cairo drawing package and the Mona_Lisa.jpg file. If both exist, it exits; this avoids the setup overhead for multiple executions with changing parameters. The setup also produces a log to aid in locating problems.

Three Linux commands are used in the setup. Note that "!" is used to execute a Linux command from the Python environment; when "!" is used, the output of the Linux command is returned to Python as a string.

"apt-get" is part of the Debian/Ubuntu Linux package-management system and is used to install libcairo2-dev, the Cairo library.

"pip3" is the Python 3 package manager and is used to install pycairo, the Python interface to the binary library.

And finally, "wget" is used to download Mona_Lisa.jpg from drive.google.com using HTTPS. The sharable link to Mona_Lisa.jpg on my Google Drive is https://drive.google.com/open?id=1_hUkSmQjXfxMwq170pOlRxdgf7hajgHd

The portion of this link following "id=" is the Google fileID. These fileIDs can be used in wget commands to download files to your Colaboratory environment. The required form is (the URL is quoted so the shell does not treat the "&" as a command separator):

wget -O ColabFileName "https://drive.google.com/uc?export=download&id=FileID"

The capability to run sharable notebooks from Github and the capacity to access data from a google drive with wget are two pillars of secure runnable shared code on Colaboratory.

Sections 2 and 3 are "libraries." Section 2 defines the sliceIm and tileIm functions, which implement the algorithms described at the top of this post. drawImage, a function defined in Section 3, creates a scalable vector graphics drawing of the rectangles produced by sliceIm or tileIm.

A great advantage of Cairo when used within Colaboratory/Jupyter notebooks is that regardless of the graphics backend used (SVG, PDF, JPEG, Microsoft Windows, …), one can always produce a PNG image file by calling the cairo.Surface method write_to_png. The displayImage function uses write_to_png to produce an in-memory byte string. The IPython.core.display Image method converts the byte string to a PNG image so that the IPython.core.display.display method can render the drawing in the notebook output.

The first cell of Section 4 (the main program) first imports the required Python packages and then reads the input image. Open, from the Python Image Library (PIL), supports most computer image file formats. Convert is used to convert color photos to black and white using the industry-standard conversion. The last two lines of the cell use the numpy package to produce a "negative" image array with black as 1.0 and white as 0.0.
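Those steps can be sketched as follows, with a tiny generated gradient standing in for Mona_Lisa.jpg:

```python
import numpy as np
from PIL import Image

# Stand-in for Mona_Lisa.jpg: a tiny 4x4 RGB gradient built in memory.
arr = np.tile(np.arange(0, 256, 64, dtype=np.uint8), (4, 1))
img = Image.merge("RGB", [Image.fromarray(arr, "L")] * 3)

# The notebook's steps: open -> convert("L") -> invert to "ink" in [0, 1].
gray = img.convert("L")                  # standard luminance conversion
ink = 1.0 - np.asarray(gray) / 255.0     # black -> 1.0, white -> 0.0
```

The resulting `ink` array is what the slicing functions consume: summing a region gives its total blackness directly.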

The next cell contains a new construct: a notebook form. A form is a simple user interface for entering data into the program. When you change a value in the form, the code is rewritten to assign the new value.
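A form field looks like this (a hedged sketch; the field name divisors is illustrative, not necessarily the notebook's). The special “#@param” comment is what draws the form widget in Colaboratory; to plain Python it is just a comment, so the cell runs identically outside Colab.

```python
# A Colab form field: editing the widget rewrites the assignment below.
divisors = "2, 3, 5"  #@param {type:"string"}

# Parse the comma-separated entry into integers for the main loop.
divisor_list = [int(d) for d in divisors.split(",")]
print(divisor_list)
```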

The second portion of the cell defines a main procedure, which iterates over the divisors entered and slices or tiles each rectangle by the current divisor. Why define main instead of writing script-level Python? Because in a script the Cairo surface would remain referenced, the output file would not be closed, and the download cell below would download an empty file. When main returns, the surface is freed and the file is closed.
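The same scoping effect can be sketched with an ordinary file (the filename is illustrative, and the immediate close-on-return is CPython-specific reference-counting behavior):

```python
# When main() returns, its locals are released; CPython then closes the
# file and flushes its buffer to disk. The same two lines at script level
# would keep 'out' referenced, so a later download cell could see an
# incomplete or empty file.
def main(path):
    out = open(path, "w")
    out.write("drawing goes here\n")

main("output.svg")
```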

The last two cells download the output file and setup.log; each runs only if the corresponding checkbox on the form is checked.

Please post comments about this blog using the comment link. If you have issues, suggestions, or improvements regarding the code, go to https://github.com/doug14226/colab1/issues and open a new issue.

The next code example will generate mazes. Until then, try a little programming: add code to upload your own photo, and change the program to use it.

We can write code together!

I am really excited about Colaboratory, Google’s internet app. For the first time, I can envision a community sharing computer ideas and code without being concerned with whether they are running a Macintosh, Windows 10, or Linux machine. In fact, users of iPads, Surface Go, and Android tablets can join in on the fun.

Colaboratory provides a web-based implementation of IPython/Jupyter notebooks. Since it is web-based, an extra level of safety is ensured in sharing programs: code and its products will not be downloaded to your machine without your permission. Colaboratory also implements a “playground” mode in which no files are written to your computer or to your Google Drive.

I have plans to write and publish the following programs to be shared with all:

Mazes, a maze-generation “system” for fun, art, or storytelling.
Football, machine learning to predict the Super Bowl winner.
Fantasy Football, machine learning to improve your fantasy-team draft.
Image Processing, new “filters” for your photos.
Social Barometer, machine-learning interpretation of Twitter posts from fans of your local sports team.
If you have other computer program suggestions, please forward them in your comments. To participate in this experiment, you will need a Google account with Google Drive (drive.google.com). It’s free to set up that account and join in the fun.

Finally, I promise to provide the means for users to inspect all of my source code. I will never offer programs without source, such as byte-compiled Python .pyc files or C executables. I do not want to run a stranger’s compiled code, and neither should you.

A-mazing books

One of the tricks I commonly use when visiting a bookstore is to force myself to buy only one book. It can’t be the book reviewed last weekend in the New York Times, nor can it be a book I always intended to read; it has to be unusual, unlike any of my past reading.

I used this trick in Seattle in January 2017. We had just watched the Chinese New Year street parade, with long dragons dancing down the street, followed by acrobats and dancers. With my mind charged with the exotic, I entered the downtown Barnes & Noble store, where I proceeded to shop. When picking just one book, I commonly select a translation of a foreign novel or a new edition of poetry by a poet I have never read; my one book had never addressed computer programming. But when I saw “Mazes for Programmers: Code Your Own Twisty Little Passages” and read the back-cover comments, “A book on mazes? Seriously? Yes! Because it’s fun. Remember when programming used to be fun?”, I was hooked.

Over the next several weeks I read the book cover to cover, trying to understand all of the examples. One problem: the author had written all the example code in Ruby, a language I had never used, while I had selected Python as my one language for retirement hobby computing. Translating between two computer languages is complex even when they are similar in some ways, as Ruby and Python are (both are object-oriented). It’s like the computer translation of human communication. To illustrate the difficulty, I took the first sentence above and ran it through Google Translate from English to Italian to Hungarian and back to English; this is the result: “One of the trick I buy in a bookstore is to force myself to choose only one book”

The problem of languages was solved when I discovered Sami Salkosuo’s posting of mazepy on GitHub. Someone else had done the hard part for me. My work could now be concentrated on creating different maze types and rendering those mazes as computer drawings.

Almost immediately, as I started to render mazes, I realized that my software drawings would eventually need to reference the past. I needed to understand the stories of ancient mazes in Egypt, the Greek myth of the Minotaur, and the meditative wall and pavement mazes of the medieval church. Luckily, the Buffalo library had a copy of “Mazes and Labyrinths: Their History and Development” by W. H. Matthews. I checked it out, and as I read it, I knew I wanted a copy for reference. At that point, I discovered Amazon had a Kindle version for $2.99.

Another direction of study when constructing mazes is the mathematical field of graph theory. A “perfect maze,” one in which every cell is reachable and there are no loops, is a tree in graph theory (a spanning tree of the grid). The text I have been studying is “Pearls in Graph Theory” by Hartsfield and Ringel. I recommend it.
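The tree property gives a quick sanity check on a generator: a spanning tree on n cells always has exactly n − 1 edges, so a perfect maze on an R-by-C grid must have R·C − 1 passages. Here is a hedged sketch using the simple “binary tree” algorithm (one standard maze generator, not code from the book or from mazepy): each cell carves a passage north or east at random.

```python
# Binary-tree maze generator: every cell except the top-right corner
# links to exactly one neighbor (north or east), producing a spanning
# tree of the grid and hence a perfect maze.
import random

def binary_tree_maze(rows, cols, seed=0):
    rng = random.Random(seed)
    passages = set()
    for r in range(rows):
        for c in range(cols):
            choices = []
            if r > 0:
                choices.append(((r, c), (r - 1, c)))  # link north
            if c < cols - 1:
                choices.append(((r, c), (r, c + 1)))  # link east
            if choices:
                passages.add(rng.choice(choices))
    return passages

maze = binary_tree_maze(4, 4)
# A perfect maze (spanning tree) on n cells has exactly n - 1 passages.
assert len(maze) == 4 * 4 - 1
```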

Finally, here are examples of my computer-generated mazes.


Local reading

When I arrived in Buffalo, fresh out of Iowa State, it was still a vibrant city with a busy downtown, the retail shopping center of the area. However, in 1965 my new hometown was on the precipice of decades of decline. Many of the seeds of its decay were shared with other Rust Belt cities, and I need not dwell on them here. But Buffalo had another significant reason for decline. Buffalo had historically been the eastern end of Great Lakes shipping: here, mills ground grain from the Midwest into flour, and blast furnaces smelted iron ore from Minnesota and the Upper Peninsula of Michigan. Unknown to Buffalo, the world was changing fast, and a major cause of its decline had been completed six years before I arrived. The Saint Lawrence Seaway, opened in 1959, had removed the primary reason for Buffalo’s rise as a major city, its role as a hub for Great Lakes shipping. Buffalo had grown from a small village into a city first because of the Erie Canal, opened in 1825, and later as a railroad hub. The Seaway allowed grain and ore to travel cheaply to markets throughout the world.

Now, more than 50 years later, Buffalo is again a growing city with a mixed economy of lighter manufacturing, education, medical care and research, and information sciences. I am retired, with time to read, including local history. Three of the books I read this year, all by local authors, addressed local history. I recommend all three authors and their books as sources for understanding the almost 200-year history of Buffalo as an urban center.

The first is a biography: Albright, the Life and Times of John J. Albright, by Mark Goldman. Who is this man we know only by a building, the Albright-Knox Art Gallery? He came to Buffalo from Pennsylvania scouting a new location for the Lackawanna Steel Company, later sold to Bethlehem Steel. He then moved on to both banking and electric power generation. By 1895 he was on the board of the Buffalo Fine Arts Academy, and he led the drive to establish a home for the academy, the gallery that would bear his name.

Later, I read American Chartres: Buffalo’s Waterfront Grain Elevators by Bruce Jackson. Although one can drive along the Buffalo River and see Buffalo’s mostly defunct elevators, Bruce Jackson’s eye with his camera, and his access behind locked gates, provide a valuable adjunct to understanding these monuments to Buffalo’s past. As to the title? Both the interior of the cathedral at Chartres and the exterior of Silo City silently shout “look up.”

Today I finished The Best Planned City in the World: Olmsted, Vaux, and the Buffalo Park System by Francis R. Kowsky, an excellent, well-illustrated history of Olmsted and Vaux’s work in Buffalo. I was familiar with Olmsted’s connection with Delaware Park (then “The Park”), Martin Luther King Park (then “The Parade”), and the interconnecting Humboldt Parkway, now lost to expressways. But I did not know that he helped design, and fought for, every square and park in the city of Buffalo, and led the international fight for the preservation of Niagara Falls. Highly recommended reading for understanding the design and history of Buffalo.

One more recommendation: Right Here, Right Now: The Buffalo Anthology, edited by Jody K. Biehl, an exciting collection of history, stories, and essays by current and past residents of Buffalo.