Ripley v2.0: Let Her RIP

What’s In A Name?

During this re-design phase, I’ve been thinking about what Ripley actually is. She’s not really a robot, as she doesn’t perform any specific tasks; she’s not a drone, as a) she isn’t remotely controlled and b) she has no specific purpose; and she isn’t an android, as she doesn’t look remotely human.

Eventually I did discover a way to describe her: she’s a RIP, a Roving Intelligent Platform:

  • Roving: She moves around under her own power and can be given planned routes or decide her own, depending on the environment.
  • Intelligent: She can make decisions based on what her senses tell her about her immediate environment and resolve conflicts should they arise.
  • Platform: She has no current purpose except that of an experiment, but can be tailored to whatever function is required.

I hadn’t named her with the RIP acronym in mind but, fortunately, it fits quite nicely.


Intelligent Design?

As a basis for the redesign, I have managed to cobble together the following diagram, mainly to organise my thoughts into some form of coherent pattern and also to give me a bit of a starting point. I’ll eventually expand on each of the topics (the bits with the thick blue lines) to a point where there is actual code but, for now, this will serve as the basic description.


RIPley: Initial Design Layout

As you can see, the left-hand part of the diagram is the hardware/software interface, featuring the RC2014 & the Pi. The right-hand portion shows my thoughts on the management software and the intelligence functions (decision making, conflict resolution etc). This is wholly Pi-based, written in C/C++ and running under Linux (Ubuntu).


Hardware Management:

In the Hardware Management pane you can see the Operations topics of Forward, Rear & Tilt sensors and the camera. These are the functions that will control the hardware and are detailed in the Software Control pane on the left. These function names are leftovers from the original software build used to test Ripley’s hardware, prior to this re-design.
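To give a flavour of what the sensor-side code involves, here’s a minimal sketch of the maths behind an ultrasonic range reading. The timing itself would come from wiringPi on the Pi (or the RC2014’s I/O board); the function name and the use of ultrasonic sensors as distance sensors are my own illustration, not lifted from the actual build.

```cpp
#include <cassert>

// Convert an ultrasonic echo pulse width (in microseconds) to a distance
// in centimetres. Sound travels roughly 0.0343 cm/us at room temperature,
// and the pulse covers the distance twice (out and back), hence the
// divide by two.
double pulseToCm(double echo_us) {
    return (echo_us * 0.0343) / 2.0;
}
```

On the hardware, the pulse width would be measured by timestamping the echo pin’s rising and falling edges; only the conversion is shown here.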

Further Operations functions are:

  • GPS (location services), which will be provided by the Adafruit GPS HAT module,
  • GSM/GPRS (mobile data) provided by a DROK SIM800 GSM module &
  • WiFi services provided by the Pi’s builtin Broadcom wireless NIC.


All of these Operations functions are controlled by the Management functions:

  • Forward, Rear & Tilt sensors are managed by Spatial Management (SM), giving Ripley a sense of her external environment: what’s in front, what’s behind and the direction of any incline she’s on, plus a feed from the GPS so she knows how far she is from her OP (origin point).
  • Camera Management (CM) controls the camera, including pan/tilt/zoom and whether or not image recognition is required or just a feed to the ‘net.
  • Location Management (LM) monitors the GPS feed and provides GPS data to other functions as required.
  • Remote Control Management (RCM) switches between WiFi and GSM/GPRS (depending on whether recognised WiFi is in range or not) to maintain the network connection and also manages remote control instructions when requested.
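The RCM switching rule above is simple enough to sketch now. The rule itself (prefer recognised WiFi, fall back to GSM/GPRS so the connection is never lost) comes from the description above; the type and function names are my own, assumed for illustration.

```cpp
#include <cassert>

// Which uplink the Remote Control Management (RCM) function should use.
enum class Link { WiFi, Gsm };

// Prefer WiFi whenever a recognised network is in range; otherwise fall
// back to the SIM800's GSM/GPRS link so the network connection is
// maintained.
Link selectLink(bool recognisedWifiInRange) {
    return recognisedWifiInRange ? Link::WiFi : Link::Gsm;
}
```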



These functions provide Ripley’s ‘brain’, such as it is:

  • Image Recognition (IR) utilises basic object recognition to allow Ripley to distinguish between static objects (such as rocks, fences, walls etc) and mobile objects (cats, dogs, humans etc). This gives her the ability to decide whether or not to go round an object (as in a rock or a fence) or to wait for the object to move of its own accord (as in human or canine obstacles) or even to run away. IR works alongside the forward and rear ultrasonic sensors.
  • Collision Avoidance (CA) works alongside IR, Spatial & Location management to avoid roads (LM), mobile obstacles (IR) and static obstacles (SM).
  • Decision Making is fairly self-explanatory and works hand-in-hand with all of the other functions, including:
  • Conflict Management. This function is for resolving conflicts in decisions, being the final arbiter. Not much is known about this function at the moment.
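Not much is known about Conflict Management yet, but one simple shape it could take is a fixed-priority arbiter: each function proposes an action, and the highest-priority proposal wins (so, say, Collision Avoidance can always overrule route-following). Everything below, including the names and the example priorities, is my own sketch rather than the actual design.

```cpp
#include <cassert>
#include <string>
#include <vector>

// A proposed action from one of the decision-making functions, tagged
// with a fixed priority: higher values win when proposals conflict.
struct Proposal {
    std::string source;   // e.g. "CA" (Collision Avoidance), "DM" (Decision Making)
    std::string action;   // e.g. "stop", "continue", "turn-left"
    int priority;
};

// Conflict Management as the final arbiter: simply pick the
// highest-priority proposal from the list.
Proposal arbitrate(const std::vector<Proposal>& proposals) {
    Proposal best = proposals.front();
    for (const auto& p : proposals)
        if (p.priority > best.priority) best = p;
    return best;
}
```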



This has been a fairly short intro to the current layout. I’m sure that it will change with time but any changes will be documented here in the blog. As each topic is developed, I will post an individual explanation of each. I’m sure there is someone out there who will find this vaguely interesting.

As always, if so inclined then leave comments but keep them clean. Ta.

Christine x


Ripley v2.0: Z80 & The Art of Pi.

Please note: This blog also contains my thoughts and immediate ideas so things may seem to go off at a tangent but I want to keep a complete record. If I seem to be rambling don’t worry, I’ll get back on track eventually.

The original Ripley design utilised a Raspberry Pi and several chunks of control electronics built onto the chassis of an old radio-controlled car that I had lying around. Since I started the project, things have changed and so has Ripley.

After a conversation with a friend of mine recently, I have redesigned Ripley with a view to splitting up the hardware control and AI functions to increase response times. I had a worry that, despite using a fast processor and running everything on Linux, there would still be too much of a lag between AI decisions and hardware actuations.

This has resulted in the hardware being controlled by an RC2014 (a Z80-based machine designed by Semachthemonkey), with the control functions written in Z80 assembler (my native language), and the AI and image recognition software running on the Pi 3B, written in C/C++ (my second language) under a Linux OS, which will also handle the networking and GPRS comms.

The following is a diagram showing an idea of how this will look on the hardware side. As I build it, I’ll document it fully, but this will serve to demonstrate the proposition (and if you can tell me where the line in italics comes from then you win my absolute admiration for geekiness).


Mindmap of Ripley, utilising the RC2014 & Raspberry Pi 3B


The original design called for the Pi to play the parts of hardware controller, comms manager and AI. Even with the quad-core ARM processor and Linux for the OS, this would probably be a difficult task, especially as a camera and image recognition are involved.

It may be that the camera streaming and comms can be handled by the RC2014, taking more of the load off the Pi, leaving more resources for AI and database processing.


Much of the hardware is readily available. Most of it I ordered online via Amazon, but some of it is kit that I had lying around, e.g. the chassis with servos and drive motor, the power supply (7.2 V 7200 mAh battery) and the camera. The connectors, ultrasonic sensors & stepper motor are part of an electronics kit designed for use with the Pi. The power distribution, H-bridge motor control & servo control were all purchased separately along with, of course, the Pi and RC2014 themselves. I will provide a complete list of hardware later.



The RC2014 uses Z80 assembler as its native language, so it’s fast with very little processing lag despite the 7.3 MHz clock speed. The digital I/O board can be adapted to interface with the digital connections of the control hardware.

The Pi, in this case, uses a cut-down version of Ubuntu 14.04, with the control and AI software written in C/C++ via MS Visual Studio (connected to the Pi over SSH) for source control (Git). The Pi will still need the wiringPi interface software installed for the GPRS and image recognition but, otherwise, it will use either the RC2014’s 115 kbaud serial interface or (if I can create the interface) the address/data buses to communicate.
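A serial link at 115 kbaud will want some light framing so either side can spot a corrupted command. The packet layout below (start byte, length, payload, XOR checksum) is purely my own sketch of what that could look like, not an RC2014 protocol; XOR is chosen because the Z80-side check costs only a couple of instructions per byte.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Frame a command for the Pi <-> RC2014 serial link:
// [0x7E][length][payload...][checksum], where the checksum is the XOR
// of the length byte and every payload byte.
std::vector<uint8_t> frame(const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> pkt;
    pkt.push_back(0x7E);                                  // start-of-frame marker
    pkt.push_back(static_cast<uint8_t>(payload.size()));  // payload length
    uint8_t sum = static_cast<uint8_t>(payload.size());
    for (uint8_t b : payload) {
        pkt.push_back(b);
        sum ^= b;
    }
    pkt.push_back(sum);                                   // XOR checksum
    return pkt;
}
```

The receiving end would recompute the XOR over length and payload and drop any frame where it doesn’t match the final byte.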

For reference, Ripley_v1_Main_v3_A (pdf) is the last original design for Ripley.

So, that’s a quick overview of how things stand. I can’t guarantee regular updates, as my job is unpredictable, but I will keep updating as best I can. Of course, comments are welcome, just keep them clean.

Christine x