Showing posts from January, 2008

Cellular Automata - Part II, Using Condor

My post on Cellular Automata from January 12 was not put there by accident: I want to use it as a starting point for a couple of exercises in my Parallel Processing course.
In that post I gave a few drawings that differ only in the generating rule number.

Today I am going to show how the Condor High-Throughput Computing system makes it very simple to handle a large volume of computations.

I used this simple Condor submit file:

universe   = vanilla
executable =
error      = err.$(Process)
output     = out.$(Process)
log        = log.$(Process)
arguments  = $(Process)
queue 256

With it I was able to compute the whole set of 256 rules (jobs) with the same effort as computing a single rule.
I submitted the task to my Personal Condor on my laptop and was not disappointed; after a while all the outputs were happily waiting for post-processing.
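The executable line in the submit file above is left blank in the post. Whatever the program is, the key mechanism is that Condor substitutes each job's number for $(Process) in the arguments line, so every job receives its own rule number on the command line. A minimal, hypothetical sketch of that entry point (the function and file names are mine, not from the post):

```python
import sys

def rule_from_args(argv):
    """Condor substitutes $(Process) (0..255) into the arguments line,
    so each job sees its rule number as the single command-line argument."""
    rule = int(argv[1])
    if not 0 <= rule <= 255:
        raise ValueError(f"rule number out of range: {rule}")
    return rule

if __name__ == "__main__":
    rule = rule_from_args(sys.argv)
    # ... compute the automaton for this rule; whatever is printed
    # here lands in out.$(Process) thanks to the submit file.
    print(f"job computing rule {rule}")
```

Queueing 256 jobs then fans this one script out over every rule number with no per-rule bookkeeping.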

A special lecture at the BGU by Barton Miller

Distinguished Lecture Guest: Prof. Barton P. Miller
Computer Science Department
University of Wisconsin - Madison
Monday, January 28th, 2008 14:00-16:00 in the Saal Auditorium (202), Alon Hi-Tech Bldg (37)
at the Ben-Gurion University of the Negev, Beer-Sheva

A Framework for Binary Code Analysis and Static and Dynamic Patching
Barton P. Miller
Computer Sciences Department
University of Wisconsin
Madison, WI 53706

Tools that analyze and modify binary code are crucial to many areas of computer science, including cyber forensics, program tracing, debugging, testing, performance profiling, performance modeling, and software engineering. While there are many tools used to support these activities, these tools have significant limitations in functionality, efficiency, accuracy, portability, and availability.

To overcome these limitations, we are actively working on the design and implementation of a new framewor…

The mysteries of Cellular Automata

I recently bought Stephen Wolfram's book "A New Kind of Science". The book is interesting and I highly recommend it.
In the spirit of one of my favorite sayings of Confucius, "I hear and I forget. I see and I remember. I do and I understand," I decided to reproduce some of the first examples given in the book. I wrote about 100 lines of Python code and enjoyed the beauty of the results.
The whole book is available online and I am referring here to the plots on page 55. A few of my plots are enclosed below.

Rule 25:

Rule 22:

Rule 30:

Rule 60:

Rule 73:

The richness of the patterns produced by such simple interaction rules is still a mystery to me.
I would call it Social Networking by pixels.
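The ~100 lines of Python themselves are not included in the post. As a sketch of the core idea only (not the author's actual code), here is a minimal elementary cellular automaton, assuming a single seed cell in the middle and zero boundary conditions:

```python
def step(row, rule):
    """One generation of an elementary CA (rule is 0-255).
    Cells beyond the edges are treated as 0."""
    padded = [0] + row + [0]
    # Each cell's next state is the bit of `rule` selected by its
    # 3-cell neighborhood (left, center, right) read as a binary number.
    return [(rule >> ((padded[i] << 2) | (padded[i + 1] << 1) | padded[i + 2])) & 1
            for i in range(len(row))]

def evolve(rule, width=63, generations=31):
    """Return the full history of rows, starting from a single black cell."""
    row = [0] * width
    row[width // 2] = 1
    rows = [row]
    for _ in range(generations):
        row = step(row, rule)
        rows.append(row)
    return rows

if __name__ == "__main__":
    for row in evolve(30):  # try 22, 60, 73, ... for the other plots
        print("".join("#" if c else "." for c in row))
```

Printing the rows as text already shows the familiar rule 30 triangle; the plots above were presumably rendered with a proper plotting library on top of the same update rule.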

The next IGT HPC work group meeting

Monday, January 14th, 2008
IGT Offices, Maskit 4, 5th Floor, Hertzliya
14:00-14:15: OPENING - Avner & Guy
14:15-15:00: "GridGain – Java Grid Computing Made Simple"
Speaker: Nikita Ivanov, Founder, GridGain Systems
Duration: 45 minutes
Language: English

This presentation is intended to provide a high-level overview of
GridGain, an open source Java grid computing framework. The presentation
is arranged to provide both a canonical overview of the software
framework and a live coding demonstration underscoring the powerful
simplicity of GridGain. The presentation is split into approximately two
equal parts:
* In the first part, a formal introduction to GridGain and
computational grid computing is provided. Different types of grid
computing will be briefly discussed, as well as the key
features that GridGain provides.
* In the second part, a live coding example will be shown,
demonstrating building and running a grid application from
scratch in front of the audience. This demonstration will
highlight one of …

First trials with Hadoop

I followed the Hadoop Quickstart guide and the whole process is described below.

This post can be used as a reference for other people installing Hadoop.

My system is OpenSuse 10.3 and Java version is 1.6.0_03.

After downloading and installing the package I did the Standalone operation test:

$ mkdir input
$ cp conf/*.xml input
$ bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
$ cat output/*

Here is the output:

gtz2:/home/telzur/downloads/hadoop-0.14.4 # bin/hadoop jar hadoop-0.14.4-examples.jar grep input output 'dfs[a-z.]+'
08/01/05 15:47:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
08/01/05 15:47:13 INFO mapred.FileInputFormat: Total input paths to process : 3
08/01/05 15:47:13 INFO mapred.JobClient: Running job: job_local_1
08/01/05 15:47:13 INFO mapred.MapTask: numReduceTasks: 1
08/01/05 15:47:13 INFO mapred.LocalJobRunner: file:/home/telzur/downloads/…

How to encrypt/decrypt a file using openssl

To encrypt:
# openssl bf -a -salt -in original_file.odt -out
you will be prompted to type and then re-type a password.

Here, bf stands for the Blowfish algorithm.

To decrypt:
# openssl bf -d -salt -a -in ./ -out ./original_file.odt
use the same password when asked.

-d stands for decryption

For more information and examples type: man enc
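For a complete, non-interactive round trip you can supply the password on the command line with -pass. The sketch below uses AES-256-CBC instead of Blowfish, since on current OpenSSL builds bf may be unavailable without the legacy provider; the filenames and password are made up for this example:

```shell
# Illustrative only: note.txt and the password are invented for this example.
echo "hello" > note.txt

# Encrypt: -a = base64-encode the output, -salt = use a random salt,
# -pass pass:... = supply the password instead of prompting
openssl enc -aes-256-cbc -a -salt -in note.txt -out note.txt.enc -pass pass:mysecret

# Decrypt (-d) with the same password
openssl enc -aes-256-cbc -d -a -in note.txt.enc -out note_restored.txt -pass pass:mysecret

cmp note.txt note_restored.txt && echo "round-trip OK"
```

The same -a/-salt/-d flags behave identically with the bf cipher used in the post.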