<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet href="https://shkspr.mobi/blog/wp-content/themes/edent-wordpress-theme/rss-style.xsl" type="text/xsl"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>
<channel>
	<title>development &#8211; Terence Eden’s Blog</title>
	<atom:link href="https://shkspr.mobi/blog/tag/development/feed/" rel="self" type="application/rss+xml" />
	<link>https://shkspr.mobi/blog</link>
	<description>Regular nonsense about tech and its effects 🙃</description>
	<lastBuildDate>Tue, 14 Apr 2026 21:08:35 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://shkspr.mobi/blog/wp-content/uploads/2023/07/cropped-avatar-32x32.jpeg</url>
	<title>development &#8211; Terence Eden’s Blog</title>
	<link>https://shkspr.mobi/blog</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title><![CDATA[Post-It Notes aren't Agile - they're wallpaper]]></title>
		<link>https://shkspr.mobi/blog/2020/02/post-it-notes-arent-agile-theyre-wallpaper/</link>
					<comments>https://shkspr.mobi/blog/2020/02/post-it-notes-arent-agile-theyre-wallpaper/#comments</comments>
				<dc:creator><![CDATA[@edent]]></dc:creator>
		<pubDate>Wed, 12 Feb 2020 12:44:53 +0000</pubDate>
				<category><![CDATA[/etc/]]></category>
		<category><![CDATA[agile]]></category>
		<category><![CDATA[development]]></category>
		<category><![CDATA[paper]]></category>
		<guid isPermaLink="false">https://shkspr.mobi/blog/?p=33763</guid>

					<description><![CDATA[Post-it® notes are the life-blood of Agile. So we&#039;re told. Those little flaps of paper, usually hastily scribbled on, are the only way to prove you&#039;re Doing It Right™.  I&#039;m not a big fan. They&#039;re environmentally wasteful, inaccessible, and a bit crap for remote workers. But some people love them, so who am I to judge?  Recently, I visited a fairly large company who are making the painful tr…]]></description>
										<content:encoded><![CDATA[<p>Post-it® notes are the life-blood of Agile. So we're told. Those little flaps of paper, usually hastily scribbled on, are the only way to prove you're Doing It Right™.</p>

<p>I'm not a big fan. They're environmentally wasteful, inaccessible, and a bit crap for remote workers. But some people love them, so who am I to judge?</p>

<p>Recently, I visited a fairly large company who are making the painful transition from providing mega-software to being a nimble, digital supplier.  Their walls were plentifully decorated with multi-coloured Post-it® notes.</p>

<p>Decorated being the operative word. A quick glance at them showed titles like "To-Do 2018" and "Easter Fire-Break" and "FAO Jerry".</p>

<p>"Who is Jerry?" I asked.</p>

<p>"Oh... I think he left a few months back," came the reply.</p>

<p>Now, not <em>all</em> of the Kanban Boards were outdated - some were obviously in use and had teams performing their daily rituals in front of them. But the majority seemed abandoned.</p>

<p>Perhaps abandoned is too strong a word. They were like cave paintings. Evidence of the hunt, sure, but now decorations to be marvelled at. A way to indoctrinate new members of the tribe.</p>

<p>Perhaps the Post-it® notes were good-luck charms. A steady stream of investors would have walked through the hallways and seen "evidence" of an advanced civilisation.</p>

<p>Perhaps the Post-it® notes were to ward off evil spirits. A cranky manager would have been mollified that his team were truly agile, and then left them alone in peace to carry on their waterfall development.</p>

<p>What I'm trying to say is this.  You can't put up wallpaper and pretend it is structural transformation.</p>
<img src="https://shkspr.mobi/blog/wp-content/themes/edent-wordpress-theme/info/okgo.php?ID=33763&HTTP_REFERER=RSS" alt="" width="1" height="1" loading="eager">]]></content:encoded>
					
					<wfw:commentRss>https://shkspr.mobi/blog/2020/02/post-it-notes-arent-agile-theyre-wallpaper/feed/</wfw:commentRss>
			<slash:comments>9</slash:comments>
		
		
			</item>
		<item>
		<title><![CDATA[Optical Theremin - Demo]]></title>
		<link>https://shkspr.mobi/blog/2012/06/optical-theremin-demo/</link>
					<comments>https://shkspr.mobi/blog/2012/06/optical-theremin-demo/#comments</comments>
				<dc:creator><![CDATA[@edent]]></dc:creator>
		<pubDate>Sun, 10 Jun 2012 15:08:49 +0000</pubDate>
				<category><![CDATA[mobile]]></category>
		<category><![CDATA[android]]></category>
		<category><![CDATA[demo]]></category>
		<category><![CDATA[development]]></category>
		<category><![CDATA[hacks]]></category>
		<category><![CDATA[ota12]]></category>
		<category><![CDATA[over the air]]></category>
		<category><![CDATA[theremin]]></category>
		<guid isPermaLink="false">http://shkspr.mobi/blog/?p=5897</guid>

					<description><![CDATA[At Over The Air I demonstrated what I considered a novel use for one of Android&#039;s sensors.  I wanted to create a Theremin - a type of musical instrument which is played by moving one&#039;s hand over it - changing pitch and tone by moving nearer or further away.    My first attempt used the proximity sensor.  However, on all the Android phones I tried the sensor&#039;s accuracy was binary - it could sense…]]></description>
										<content:encoded><![CDATA[<p>At <a href="https://shkspr.mobi/blog/2012/06/over-the-air-2012/">Over The Air</a> I demonstrated what I considered a novel use for one of Android's sensors.  I wanted to create a Theremin - a type of musical instrument which is played by moving one's hand over it - changing pitch and tone by moving nearer or further away.</p>

<img src="https://shkspr.mobi/blog/wp-content/uploads/2012/06/Edent-theremin-ota12.jpg" alt="" width="500" height="333" class="aligncenter size-full wp-image-15496">

<p>My first attempt used the <a href="http://developer.android.com/reference/android/hardware/Sensor.html#TYPE_PROXIMITY">proximity sensor</a>.  However, on all the Android phones I tried the sensor's accuracy was binary - it could sense if something was close by, but not say <em>how</em> close.</p>

<p>So, what else could I use to detect how near or far a hand was from the screen?  I decided to co-opt the <a href="http://developer.android.com/reference/android/hardware/Sensor.html#TYPE_LIGHT">Light Sensor</a>.  This is normally used to automatically adjust the brightness of the screen - making it easier to see in strong light.</p>

<p>When the light sensor is uncovered, the total lux (that's the measure of light) may be 100. As a hand moves closer to it, that value will dip until it reaches 0 (or, on my phone, 4).</p>

<p>We can then represent that light value as a sound - essentially transforming lx into Hz!</p>

<p>This is what it sounds like:</p>

<audio controls="controls">
   <source src="https://shkspr.mobi/blog/wp-content/uploads/2012/06/Terence-Eden-Teremin.ogg">
   <source src="https://shkspr.mobi/blog/wp-content/uploads/2012/06/Terence-Eden-Teremin.mp3">
  Your browser does not support the audio element - <a href="https://shkspr.mobi/blog/wp-content/uploads/2012/06/Terence-Eden-Teremin.mp3">download the track</a>.
</audio>

<p>Beautiful, I'm sure you agree!  You can hear <a href="https://www.bbc.co.uk/programmes/p02swr6h">an interview where I discuss this app with the BBC's Jamillah Knowles on the Outriders Podcast</a> (22m 50s in).
</p><figure class="audio">
	<figcaption>🔊 Outriders 05 Jun 12: Moscow and Bletchley Park<br>🎤 BBC Radio 5 live</figcaption>
	
	<audio controls="" loading="lazy" src="https://shkspr.mobi/blog/wp-content/uploads/2012/06/OutridersPodcast-20120605-MoscowAndBletchleyPark.mp3">
		<p>💾 <a href="https://shkspr.mobi/blog/wp-content/uploads/2012/06/OutridersPodcast-20120605-MoscowAndBletchleyPark.mp3">Download this audio file</a>.</p>
	</audio>
</figure><p></p>

<p>If you want to have a play with it, the <a href="https://web.archive.org/web/20130623224607/https://play.google.com/store/apps/details?id=mobi.shkspr.android.theramin">Optical Theremin Demo is in the Google Play Store</a>.  Do note, it was coded in a couple of sleep-deprived hours, crashes when you exit, and can produce "music" which scares children and animals. You have been warned!</p>

<h2 id="use-the-source-luke"><a href="https://shkspr.mobi/blog/2012/06/optical-theremin-demo/#use-the-source-luke">Use The Source, Luke!</a></h2>

<p>I've included the full source below, but I'd like to pick out two points which may be of interest.</p>

<h3 id="getting-the-lux-value"><a href="https://shkspr.mobi/blog/2012/06/optical-theremin-demo/#getting-the-lux-value">Getting The Lux Value</a></h3>

<p>Firstly, we need to register a listener for the light sensor.</p>

<pre><code class="language-java">@Override public void onCreate(Bundle savedInstanceState) {
   super.onCreate(savedInstanceState);
   mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
   mLightSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
   mSensorManager.registerListener(this, mLightSensor, SensorManager.SENSOR_DELAY_FASTEST);
}
</code></pre>

<p>Every time the light sensor's reading changes, this method will be called. It takes the light value and performs a simple mathematical transformation on it (adds 10, multiplies by 5).  I found that this gave the most pleasing sound - but you can adjust it to your taste.</p>

<pre><code class="language-java">@Override public void onSensorChanged(SensorEvent event){
   if (event.sensor.getType()==Sensor.TYPE_LIGHT){
       mLux = event.values[0];
       freqOfTone = (mLux +10) * 5;
   }
}
</code></pre>

<h3 id="cum-on-feel-the-noize"><a href="https://shkspr.mobi/blog/2012/06/optical-theremin-demo/#cum-on-feel-the-noize">Cum on Feel the Noize</a></h3>

<p>So, how do we get Android to generate a tone? I faffed around with <a href="http://stackoverflow.com/questions/2413426/playing-an-arbitrary-tone-with-android">this audio generating code from StackOverflow</a> until I could successfully generate a tone.</p>

<p>Essentially, this generates the raw 16-bit PCM samples for a tone and gets them ready to play.</p>
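<p>As a rough illustration, here is that idea boiled down to plain Java with no Android dependencies: generate the samples of a sine wave and pack them into little-endian 16-bit PCM. The class and method names are mine, not from the app's source.</p>

<pre><code class="language-java">import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ToneSketch {
    // Build little-endian 16-bit PCM bytes for a sine tone.
    public static byte[] pcmSineTone(int sampleRate, double freqHz, int numSamples) {
        // 2 bytes per 16-bit sample; low-order byte first, as the app's code assumes
        ByteBuffer buf = ByteBuffer.allocate(2 * numSamples).order(ByteOrder.LITTLE_ENDIAN);
        for (int i = 0; i != numSamples; i++) {
            double sample = Math.sin(2 * Math.PI * i * freqHz / sampleRate);
            buf.putShort((short) (sample * 32767)); // scale to maximum 16-bit amplitude
        }
        return buf.array();
    }
}
</code></pre>

<p>Using <code>ByteBuffer</code> sidesteps the manual bit-masking the original code does by hand, but the bytes produced are the same.</p>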

<p>However, this sounded rather boring, so I added some reverb.</p>

<pre><code class="language-java">// NB: attachAuxEffect() actually expects the ID of an attached AudioEffect;
// passing the PARAM_DECAY_TIME constant here is very much a hack.
audioTrack.attachAuxEffect(EnvironmentalReverb.PARAM_DECAY_TIME);
</code></pre>

<p>And that's it!</p>

<p><a href="https://play.google.com/store/apps/details?id=mobi.shkspr.android.theramin">Download the Optical Theremin Demo App</a> - or use the source to create something much more melodious.</p>

<h3 id="full-source"><a href="https://shkspr.mobi/blog/2012/06/optical-theremin-demo/#full-source">Full Source</a></h3>

<pre><code class="language-java">package mobi.shkspr.android.theremin;

import java.util.Random;

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.audiofx.EnvironmentalReverb;
import android.os.Bundle;
import android.os.Handler;
import android.util.Log;
import android.widget.TextView;

    public class TheraminActivity
        extends Activity
        implements SensorEventListener{
        // originally from http://marblemice.blogspot.com/2010/04/generate-and-play-tone-in-android.html
        // and modified by Steve Pomeroy steve@staticfree.info

        private final int duration = 5; // seconds
        private final int sampleRate = 8000;
        private final int numSamples = duration * sampleRate;
        private final double sample[] = new double[numSamples];
        private double freqOfTone = 440; // hz

        private final byte generatedSnd[] = new byte[2 * numSamples];

        private SensorManager mSensorManager;
        private Sensor mLightSensor;
        private float mLux = 0.0f;
        private String tLux = "Lux is ";

        public AudioTrack audioTrack;

        Handler handler = new Handler();

        @Override public void onCreate(Bundle savedInstanceState) {

            super.onCreate(savedInstanceState);


            mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            mLightSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);

            mSensorManager.registerListener(this, mLightSensor, SensorManager.SENSOR_DELAY_FASTEST);

        }

        @Override public void onSensorChanged(SensorEvent event){
            if (event.sensor.getType()==Sensor.TYPE_LIGHT){
                mLux = event.values[0];
                // Show the latest reading and derive the tone frequency from it
                TextView tv = new TextView(this);
                tv.setText(tLux);
                setContentView(tv);
                freqOfTone = (mLux + 10) * 5;
            }
        }
        @Override protected void onResume() {
            super.onResume();
            final Thread thread = new Thread(new Runnable() {
                public void run() {

                    for (int i = 0; i &lt; 300; i++)   {
                        genTone();

                        audioTrack = new AudioTrack(
                                        AudioManager.STREAM_MUSIC,
                                        sampleRate,
                                        AudioFormat.CHANNEL_OUT_MONO,
                                        AudioFormat.ENCODING_PCM_16BIT,
                                        2 * numSamples, // buffer size is in bytes - 2 per 16-bit sample
                                        AudioTrack.MODE_STATIC);


                            try {

                                playSound();
                                Thread.sleep(505);
                            } catch (IllegalStateException e) {

                            } catch (InterruptedException e) {
                                // TODO Auto-generated catch block
                                e.printStackTrace();
                            }
                    }
                }
            });

            thread.start();
        }

        void genTone(){ // fill out the array
            tLux = "Frequency is " + freqOfTone;
            //Log.d("LUXTAG", "Lux value: " + tLux);

            for (int i = 0; i &lt; numSamples; ++i) {
                sample[i] = Math.sin(2 * Math.PI * i /(sampleRate/freqOfTone));
            }

        // convert to 16 bit pcm sound array
        // assumes the sample buffer is normalised.
            int idx = 0;
            for (final double dVal : sample) {
                // scale to maximum amplitude
                final short val = (short) (dVal * 32767); // in 16-bit PCM, the first byte is the low-order byte
                generatedSnd[idx++] = (byte) (val &amp; 0x00ff);
                generatedSnd[idx++] = (byte) ((val &amp; 0xff00) &gt;&gt;&gt; 8);
            }
        }

        void playSound(){
            genTone();
            try {
                audioTrack.attachAuxEffect(EnvironmentalReverb.PARAM_DECAY_TIME);
                audioTrack.write(generatedSnd, 0, generatedSnd.length);
                audioTrack.play();
            } catch (IllegalStateException e) {
                audioTrack.release();
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // TODO Auto-generated method stub
        }

        @Override
        public void onPause() {
            super.onPause();
            audioTrack.stop();
            audioTrack.flush();
            audioTrack.release();
        }

        @Override
        public void onStop() {
            super.onStop();
            audioTrack.stop();
            audioTrack.flush();
            audioTrack.release();
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            audioTrack.stop();
            audioTrack.flush();
            audioTrack.release();
        }
    }
</code></pre>
<img src="https://shkspr.mobi/blog/wp-content/themes/edent-wordpress-theme/info/okgo.php?ID=5897&HTTP_REFERER=RSS" alt="" width="1" height="1" loading="eager">]]></content:encoded>
					
					<wfw:commentRss>https://shkspr.mobi/blog/2012/06/optical-theremin-demo/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		<enclosure url="https://shkspr.mobi/blog/wp-content/uploads/2012/06/Terence-Eden-Teremin.ogg" length="121342" type="audio/ogg" />
<enclosure url="https://shkspr.mobi/blog/wp-content/uploads/2012/06/Terence-Eden-Teremin.mp3" length="198628" type="audio/mpeg" />
<enclosure url="https://shkspr.mobi/blog/wp-content/uploads/2012/06/OutridersPodcast-20120605-MoscowAndBletchleyPark.mp3" length="1227152" type="audio/mpeg" />

			</item>
		<item>
		<title><![CDATA[Should < img > Deprecate "height" and "width"?]]></title>
		<link>https://shkspr.mobi/blog/2012/04/should-img-deprecate-height-and-width/</link>
					<comments>https://shkspr.mobi/blog/2012/04/should-img-deprecate-height-and-width/#comments</comments>
				<dc:creator><![CDATA[@edent]]></dc:creator>
		<pubDate>Sat, 14 Apr 2012 07:08:28 +0000</pubDate>
				<category><![CDATA[mobile]]></category>
		<category><![CDATA[content]]></category>
		<category><![CDATA[development]]></category>
		<category><![CDATA[HTML]]></category>
		<category><![CDATA[HTML5]]></category>
		<category><![CDATA[img]]></category>
		<category><![CDATA[web]]></category>
		<guid isPermaLink="false">http://shkspr.mobi/blog/?p=5523</guid>

					<description><![CDATA[Image adaptation and resizing is a hot topic at the moment.  With devices of varying screensize accessing your site, how do you ensure that the crappy 240*240 phone gets a reasonable experience while still making everything look gorgeous on the retina-busting iPad?  One of the very first things we&#039;re taught in HTML school is that we should separate content and style.  &#38;lt;span font=&#34;comic sans&#34;…]]></description>
					<content:encoded><![CDATA[<p>Image adaptation and resizing is a hot topic at the moment.  With devices of varying screen sizes accessing your site, how do you ensure that the crappy 240*240 phone gets a reasonable experience while still making everything look gorgeous on the retina-busting iPad?</p>

<p>One of the very first things we're taught in HTML school is that we should separate content and style.</p>

<pre><code class="language-html">&amp;lt;span font="comic sans" colour="red"&amp;gt;This is wrong!&amp;lt;/span&amp;gt;
</code></pre>

<p>Instead, we should be doing</p>

<pre><code class="language-html">&amp;lt;span class="stylish"&amp;gt;This is correct!&amp;lt;/span&amp;gt;
</code></pre>

<p>Yet, the very next thing we're taught is</p>

<pre><code class="language-html">&amp;lt;img src="example.jpg" height="120" width="90" /&amp;gt;
</code></pre>

<p>Well hang on a second! We've mixed up content (example.jpg) with presentation (the dimensions of the image).  The image will almost certainly have to be resized based on the screen size accessing the site.  Which means all manner of crufty JavaScript and CSS hacks to get it to display perfectly.</p>

<h2 id="the-right-way"><a href="https://shkspr.mobi/blog/2012/04/should-img-deprecate-height-and-width/#the-right-way">The Right Way</a></h2>

<p>Here is how I think the image tag should work.</p>

<pre><code class="language-html">&amp;lt;img src="example" class="icon" /&amp;gt;
</code></pre>

<p>The first thing to note is that the image shouldn't have a file extension.  As I've set out <a href="https://brucelawson.co.uk/2011/notes-on-adaptive-images-yet-again/#comment-851202">in a comment on Bruce Lawson's blog, the server should be looking at the HTTP accept headers to see what image type to serve up</a>.  If the device is capable of displaying SVG - that's what should be sent.  If the device is too old to support PNG - the image should be served up as JPG (or whatever format the device accepts).</p>
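<p>To make that concrete, here is a rough sketch of the sort of format-picking logic a server could run against the Accept header. The class name and the format preference order are mine, purely for illustration - not from any real framework:</p>

<pre><code class="language-java">public class ImageNegotiator {
    // Pick the best image format the client says it can display,
    // preferring vector, then lossless, then lossy.
    public static String pickFormat(String acceptHeader) {
        if (acceptHeader == null) {
            return "jpg"; // safest fallback for ancient clients
        }
        if (acceptHeader.contains("image/svg+xml")) {
            return "svg";
        }
        if (acceptHeader.contains("image/png")) {
            return "png";
        }
        return "jpg";
    }
}
</code></pre>

<p>The server then responds with <code>example.svg</code>, <code>example.png</code>, or <code>example.jpg</code> as appropriate - and the markup never needs to change.</p>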

<p>Again, the <em>content</em> of the image should be separated from the <em>presentation</em> (i.e. the file format).</p>

<p>Secondly, we drop the height and the width from the img tag. In the olden days, they were needed to stop the page from dramatically reflowing as images loaded. That's still a valid concern today, but the challenge is that we don't know what physical size the image will have until it is requested.</p>

<p>"So what?" I hear you cry "We can already do this in CSS.  Images can have their dimensions set by absolute pixel size and / or relative size."</p>

<p>Indeed, you are correct.  But, <a href="https://web.archive.org/web/20120408030505/https://dev.w3.org/html5/markup/img.html">the HTML5 spec currently lists height and width</a> as attributes which <a href="https://web.archive.org/web/20120410222159/http://dev.w3.org/html5/spec/dimension-attributes.html">may be used</a>. This, I believe, acts to tempt the unwary developer into using them. They should be as obsolete as "align" and "border".</p>

<p>Ideally, the logic should be on the server-side. Your CSS shouldn't be asking the device for its own properties; your server should be dynamically generating CSS which suits the User-Agent.  The server should be adapting images on the fly (and caching them) depending on the resolution of the devices.</p>

<p>We should be writing ridiculously simple HTML5.</p>

<p>As <a href="http://www.brucelawson.co.uk/2011/notes-on-adaptive-images-yet-again/#comment-851202">I've said before</a></p>

<blockquote><p>Computers are there to do the hard work for us. We shouldn’t be writing extra markup in every single new document.
<br>
Get the silicon slaves to do it all.</p></blockquote>
<img src="https://shkspr.mobi/blog/wp-content/themes/edent-wordpress-theme/info/okgo.php?ID=5523&HTTP_REFERER=RSS" alt="" width="1" height="1" loading="eager">]]></content:encoded>
					
					<wfw:commentRss>https://shkspr.mobi/blog/2012/04/should-img-deprecate-height-and-width/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
			</item>
	</channel>
</rss>
