I decided to build a Raspberry Pi cluster to give myself a platform for practicing distributed computing technologies without relying on a cloud provider.
This first post details my hardware set-up, as well as how I used Ansible to “remote control” the installation of monitoring software on each of the Pi hosts, with the goal of observing the Raspberry Pi CPU temperatures.
Diskplayer is an audio device which plays streaming music from physical media, using 3.5” floppy disks and Spotify. You can find a GitHub repo with the code here: https://github.com/dinofizz/diskplayer and a video showing playback and record activity here: https://youtu.be/1usBGe_ZiGc
[2022-04-24] See my new post on running mitmproxy on a Raspberry Pi 4 here.
[2020-06-21] I have done another run through of this tutorial on my Raspberry Pi 3, this time with the latest Raspberry Pi OS. Changes made to the tutorial are indicated with a note featuring a timestamp “[2020-06-21]”. Let me know in the comments if you are unsuccessful. I try to re-run everything every 6 months or so.
[2019-08-03] I have since updated this post with new instructions for running mitmproxy on Raspbian Buster, which now includes Python 3.7.
In preparation for a training session I will be giving on public key infrastructure (with a focus on TLS and certificates), I wanted to demonstrate how a transparent “man-in-the-middle” (MITM) proxy works.
This post walks through the configuration of a Raspberry Pi 3 acting as a Wi-Fi access point, running a transparent man-in-the-middle proxy (mitmproxy), which can be used to sniff HTTP and HTTPS traffic from connected devices.
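Once traffic flows through the proxy, mitmproxy can run a small Python addon script against every intercepted request. As a minimal sketch (the filename and log format are my own illustration, not from the post), an addon that logs each request from connected devices might look like this:

```python
# log_requests.py — a minimal, hypothetical mitmproxy addon.
# mitmproxy calls the request() hook once for every HTTP(S) request
# it intercepts, so this prints the method and full URL of each one.

def request(flow):
    # flow.request.pretty_url is the full URL of the intercepted request
    print(f"[intercepted] {flow.request.method} {flow.request.pretty_url}")
```

You would load it with `mitmproxy -s log_requests.py`; because the proxy is transparent, clients on the Pi's access point need no proxy settings of their own.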
This is part 4 of a 4-part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:
As part of my quest to learn new things I wanted to deploy my CarbAlert solution using Docker and Docker Compose. Docker Compose is well suited to this application as it enables a group of related Docker containers to be built and deployed together.
This is part 3 of a 4-part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:
Celery is a distributed task queue framework. In conjunction with a message broker (in my case Redis) it can be used to process asynchronous tasks as well as schedule periodic tasks. I am using both of these features:
A periodic task runs every 5 minutes to initiate the Scrapy CarbSpider, which scrapes the first page of the Carbonite Laptop forum index and scans it for new threads featuring search phrases of interest.
From within the CarbPipeline activity I push asynchronous email tasks for Celery to handle. This separates the sending of my email notifications from the parsing of the thread metadata.
This is part 2 of a 4-part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:
In order to manage the search phrases and email addresses I am using Django. Django is a Python web framework, and is known for including many extras right out of the box. I am taking advantage of two specific extras: Django’s built-in ORM and the Django admin console.