<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Posts on Dino Fizzotti</title><link>https://www.dinofizzotti.com/post/</link><description>Recent content in Posts on Dino Fizzotti</description><generator>Hugo -- gohugo.io</generator><language>en-us</language><lastBuildDate>Sun, 24 Apr 2022 00:00:00 +0000</lastBuildDate><atom:link href="https://www.dinofizzotti.com/post/index.xml" rel="self" type="application/rss+xml"/><item><title>Running a man-in-the-middle proxy on a Raspberry Pi 4</title><link>https://www.dinofizzotti.com/blog/2022-04-24-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-4/</link><pubDate>Sun, 24 Apr 2022 00:00:00 +0000</pubDate><guid>https://www.dinofizzotti.com/blog/2022-04-24-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-4/</guid><description>&lt;p>&lt;img src="rpi4mitmproxy800x600.jpg" alt="Raspberry Pi 4 &amp; mitmproxy">&lt;/p>
&lt;p>This post is an update to my 2019 page on &lt;a href="https://www.dinofizzotti.com/blog/2019-01-09-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-3/">Running a man-in-the-middle proxy on a Raspberry Pi 3&lt;/a>, now revisited and rewritten to accommodate using a Raspberry Pi 4, the current version of mitmproxy (v8.0.0), Raspberry Pi OS (bullseye) as well as changes to how some of the software is installed and configured.&lt;/p>
&lt;p>I have repeated much of the original content, especially the overview and explanations, so you do not need to refer back to the original 2019 post.&lt;/p></description></item><item><title>SiteMapper Part 2: Distributed crawling using Kubernetes, NATS and Cassandra</title><link>https://www.dinofizzotti.com/blog/2022-01-04-sitemapper-part-2-distributed-crawling-using-kubernetes-nats-and-cassandra/</link><pubDate>Tue, 04 Jan 2022 01:00:00 +0000</pubDate><guid>https://www.dinofizzotti.com/blog/2022-01-04-sitemapper-part-2-distributed-crawling-using-kubernetes-nats-and-cassandra/</guid><description>&lt;p>&lt;img src="sm1024.png" alt="Image showing logos of technologies used in this project">&lt;/p>
&lt;p>In &lt;a href="https://www.dinofizzotti.com/blog/2022-01-04-sitemapper-part-1-exploring-concurrency-in-go/">Part 1&lt;/a> of this project series I created a stand-alone &lt;a href="https://github.com/dinofizz/sitemapper">CLI tool&lt;/a> written in Go to build a sitemap of internal links for a given URL to a specified maximum depth. In Part 2 I describe how I achieved the same result of creating a sitemap, but by distributing the crawl activity using the Kubernetes API to schedule independent ephemeral crawl jobs for each link. I&amp;rsquo;m using &lt;a href="https://nats.io/">NATS&lt;/a> for pod-to-pod messaging and &lt;a href="https://docs.datastax.com/en/astra/docs/">AstraDB&lt;/a> (a managed &lt;a href="https://cassandra.apache.org/_/index.html">Cassandra&lt;/a> DB) for persistence. All of the application code is written in Go.&lt;/p></description></item><item><title>SiteMapper Part 1: Exploring concurrency in Go</title><link>https://www.dinofizzotti.com/blog/2022-01-04-sitemapper-part-1-exploring-concurrency-in-go/</link><pubDate>Tue, 04 Jan 2022 00:00:00 +0000</pubDate><guid>https://www.dinofizzotti.com/blog/2022-01-04-sitemapper-part-1-exploring-concurrency-in-go/</guid><description>&lt;div class="highlight">&lt;pre style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4">&lt;code class="language-bash" data-lang="bash">$ ./sm -s https://google.com | jq
2021/12/31 14:57:26 Using mode: concurrent
2021/12/31 14:57:26 Crawling https://google.com with depth &lt;span style="color:#ae81ff">1&lt;/span>
2021/12/31 14:57:26 visiting URL https://google.com at depth &lt;span style="color:#ae81ff">0&lt;/span> with parent https://google.com
2021/12/31 14:57:26 Elapsed milliseconds: &lt;span style="color:#ae81ff">263&lt;/span>
&lt;span style="color:#f92672">[&lt;/span>
&lt;span style="color:#f92672">{&lt;/span>
&lt;span style="color:#e6db74">&amp;#34;URL&amp;#34;&lt;/span>: &lt;span style="color:#e6db74">&amp;#34;https://google.com&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;Links&amp;#34;&lt;/span>: &lt;span style="color:#f92672">[&lt;/span>
&lt;span style="color:#e6db74">&amp;#34;https://accounts.google.com/ServiceLogin&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://drive.google.com/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://mail.google.com/mail/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://news.google.com/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://play.google.com/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/advanced_search&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/intl/en/about.html&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/intl/en/ads/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/intl/en/policies/privacy/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/intl/en/policies/terms/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/preferences&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/search&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/services/&amp;#34;&lt;/span>,
&lt;span style="color:#e6db74">&amp;#34;https://www.google.com/setprefdomain&amp;#34;&lt;/span>
&lt;span style="color:#f92672">]&lt;/span>
&lt;span style="color:#f92672">}&lt;/span>
&lt;span style="color:#f92672">]&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>I&amp;rsquo;ve really been enjoying my journey in becoming more familiar with Go. One of Go&amp;rsquo;s strengths is its built-in concurrency primitives, namely &lt;a href="https://go.dev/tour/concurrency/1">goroutines&lt;/a> and &lt;a href="https://go.dev/tour/concurrency/2">channels&lt;/a>. I decided to practice my Go skills by building a tool which explores some of these features. &lt;a href="https://github.com/dinofizz/sitemapper">SiteMapper&lt;/a> accepts a root URL and a maximum depth, and then uses one of three modes of operation to crawl the site (synchronous, concurrent and concurrent-limited), building out a sitemap listing links to internal pages.&lt;/p>
&lt;p>Part 1 of this project details the stand-alone CLI tool implementation written in Go. &lt;a href="https://www.dinofizzotti.com/blog/2022-01-04-sitemapper-part-2-distributed-crawling-using-kubernetes-nats-and-cassandra/">Part 2&lt;/a> details an implementation using Kubernetes, where site crawling is performed by ephemeral Kubernetes job pods.&lt;/p></description></item><item><title>FnRow v1: A configurable function-row-layout mechanical keyboard</title><link>https://www.dinofizzotti.com/blog/2021-02-07-fnrow-v1-a-configurable-function-row-layout-mechanical-keyboard/</link><pubDate>Sun, 07 Feb 2021 00:00:00 +0000</pubDate><guid>https://www.dinofizzotti.com/blog/2021-02-07-fnrow-v1-a-configurable-function-row-layout-mechanical-keyboard/</guid><description>&lt;p>&lt;img src="fnrow_cropped.jpg" alt="FnRow">&lt;/p>
&lt;p>FnRow is a mechanical keyboard I designed and built during the Christmas break. It features a single row of switches in the form of a &amp;ldquo;function row&amp;rdquo;. Each switch is configurable and can be programmed to perform as any key on a typical keyboard, or even combinations of key presses. FnRow is akin to a &amp;ldquo;macropad&amp;rdquo;, but instead of having a square or rectangular &amp;ldquo;pad&amp;rdquo;, the switches are stretched out in a single row. All my hardware and software source files are available on &lt;a href="https://github.com/dinofizz/fnrow-pcb">GitHub&lt;/a>.&lt;/p></description></item><item><title>Raspberry Pi Cluster Part 3: Running Load Tests with Kubernetes and Locust</title><link>https://www.dinofizzotti.com/blog/2020-07-04-raspberry-pi-cluster-part-3-running-load-tests-with-kubernetes-and-locust/</link><pubDate>Sat, 04 Jul 2020 13:40:46 +0100</pubDate><guid>https://www.dinofizzotti.com/blog/2020-07-04-raspberry-pi-cluster-part-3-running-load-tests-with-kubernetes-and-locust/</guid><description>&lt;p>This post details how I used a Python based load test framework (Locust) to perform some simple tests on an HTTP API application using Kubernetes in my Raspberry Pi Cluster.&lt;/p>
&lt;p>&lt;img src="logos_scaled.png" alt="Logos for technologies used in this post.">&lt;/p></description></item><item><title>Raspberry Pi Cluster Part 2: ToDo API running on Kubernetes with k3s</title><link>https://www.dinofizzotti.com/blog/2020-05-09-raspberry-pi-cluster-part-2-todo-api-running-on-kubernetes-with-k3s/</link><pubDate>Sat, 09 May 2020 11:26:00 +0100</pubDate><guid>https://www.dinofizzotti.com/blog/2020-05-09-raspberry-pi-cluster-part-2-todo-api-running-on-kubernetes-with-k3s/</guid><description>&lt;p>In this post I go over how I set up my Kubernetes cluster across four Raspberry Pi 4 nodes using k3s, configured persistent storage using NFS, and then installed a simple &amp;ldquo;todo&amp;rdquo; API into the cluster using Helm.&lt;/p>
&lt;p>&lt;img src="cluster2.gif" alt="Pi Cluster">&lt;/p></description></item><item><title>Raspberry Pi Cluster Part 1: Provisioning with Ansible and temperature monitoring using Prometheus and Grafana</title><link>https://www.dinofizzotti.com/blog/2020-04-10-raspberry-pi-cluster-part-1-provisioning-with-ansible-and-temperature-monitoring-using-prometheus-and-grafana/</link><pubDate>Fri, 10 Apr 2020 12:46:42 +0100</pubDate><guid>https://www.dinofizzotti.com/blog/2020-04-10-raspberry-pi-cluster-part-1-provisioning-with-ansible-and-temperature-monitoring-using-prometheus-and-grafana/</guid><description>&lt;p>I decided to build a Raspberry Pi cluster to give me a platform with which I can practice distributed computing technologies without needing to rely on a cloud provider.&lt;/p>
&lt;figure >
&lt;a href="https://www.dinofizzotti.com/blog/2020-04-10-raspberry-pi-cluster-part-1-provisioning-with-ansible-and-temperature-monitoring-using-prometheus-and-grafana/IMG_3105.JPEG">
&lt;span class="caption-wrapper">
&lt;img class="caption" style="max-width: 100%; height: auto;" src="https://www.dinofizzotti.com/blog/2020-04-10-raspberry-pi-cluster-part-1-provisioning-with-ansible-and-temperature-monitoring-using-prometheus-and-grafana/IMG_3105_hu996cadf0dd189836f19898a2c78ecd5a_473080_1200x0_resize_q75_box.JPEG" width="1200" height="900" alt="Dino&amp;#39;s Pi Cluster">
&lt;/span>
&lt;/a>
&lt;/figure>
&lt;p>This first post details my hardware set-up as well as how I used Ansible to &amp;ldquo;remote control&amp;rdquo; the installation of monitoring software on each of the Pi hosts, with the goal to observe the Raspberry Pi CPU temperatures.&lt;/p></description></item><item><title>Diskplayer: Using 3.5" floppy disks to play albums on Spotify</title><link>https://www.dinofizzotti.com/blog/2020-02-05-diskplayer-using-3.5-floppy-disks-to-play-albums-on-spotify/</link><pubDate>Wed, 05 Feb 2020 22:13:04 +0000</pubDate><guid>https://www.dinofizzotti.com/blog/2020-02-05-diskplayer-using-3.5-floppy-disks-to-play-albums-on-spotify/</guid><description>&lt;figure >
&lt;a href="https://www.dinofizzotti.com/blog/2020-02-05-diskplayer-using-3.5-floppy-disks-to-play-albums-on-spotify/IMG_2207_cropped.JPG">
&lt;span class="caption-wrapper">
&lt;img class="caption" style="max-width: 100%; height: auto;" src="https://www.dinofizzotti.com/blog/2020-02-05-diskplayer-using-3.5-floppy-disks-to-play-albums-on-spotify/IMG_2207_cropped_hu51c6a3c4b5ecf70eeac4ebb0f01b9297_1564600_1200x0_resize_q75_box.JPG" width="1200" height="900" alt="Diskplayer">
&lt;/span>
&lt;/a>
&lt;/figure>
&lt;p>Diskplayer is an audio device which uses 3.5&amp;rdquo; floppy disks as physical media to play streaming music via Spotify. You can find a GitHub repo with code here: &lt;a href="https://github.com/dinofizz/diskplayer">https://github.com/dinofizz/diskplayer&lt;/a> and a video showing playback and record activity here: &lt;a href="https://youtu.be/1usBGe_ZiGc">https://youtu.be/1usBGe_ZiGc&lt;/a>&lt;/p></description></item><item><title>Running a man-in-the-middle proxy on a Raspberry Pi 3</title><link>https://www.dinofizzotti.com/blog/2019-01-09-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-3/</link><pubDate>Wed, 09 Jan 2019 08:35:00 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2019-01-09-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-3/</guid><description>&lt;figure >
&lt;a href="https://www.dinofizzotti.com/blog/2019-01-09-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-3/rpi_mitmproxy.JPG">
&lt;span class="caption-wrapper">
&lt;img class="caption" style="max-width: 100%; height: auto;" src="https://www.dinofizzotti.com/blog/2019-01-09-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-3/rpi_mitmproxy_huabe772d0f7cce00f9605e4c5e8419225_1363824_1200x0_resize_q75_box.JPG" width="1200" height="900" alt="Raspberry Pi 3 and mitmproxy">
&lt;/span>
&lt;/a>
&lt;/figure>
&lt;p>&lt;em>&lt;strong>[2022-04-24] See my new post on running mitmproxy on a Raspberry Pi 4 &lt;a href="https://www.dinofizzotti.com/blog/2022-04-24-running-a-man-in-the-middle-proxy-on-a-raspberry-pi-4/">here&lt;/a>.&lt;/strong>&lt;/em>&lt;/p>
&lt;p>&lt;em>[2020-06-21] I have done another run-through of this tutorial on my Raspberry Pi 3, this time with the latest Raspberry Pi OS. Changes made to the tutorial are indicated with a note featuring a timestamp &amp;ldquo;[2020-06-21]&amp;rdquo;. Let me know in the comments if you are unsuccessful. I try to re-run everything every 6 months or so.&lt;/em>&lt;/p>
&lt;p>&lt;em>[2019-08-03] I have since updated this post with new instructions for running mitmproxy on Raspbian Buster, which now includes Python 3.7.&lt;/em>&lt;/p>
&lt;p>In preparation for a training session I will be giving on public key infrastructure (with a focus on TLS and certificates) I wanted to demonstrate how a transparent &amp;ldquo;&lt;a href="https://en.wikipedia.org/wiki/Man-in-the-middle_attack">man-in-the-middle&lt;/a>&amp;rdquo; (MITM) proxy works.&lt;/p>
&lt;p>This post walks through the configuration of a Raspberry Pi 3 acting as a Wi-Fi access point, running a transparent man-in-the-middle proxy (&lt;a href="https://mitmproxy.org/">mitmproxy&lt;/a>), which can be used to sniff HTTP and HTTPS traffic on connected devices.&lt;/p></description></item><item><title>CarbAlert - Part 4: Deploying and Using CarbAlert</title><link>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-4-deploying-and-using-carbalert/</link><pubDate>Sun, 14 Oct 2018 08:27:43 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-4-deploying-and-using-carbalert/</guid><description>&lt;p>This is part 4 of a 4 part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/">Part 1: Let Your Next Laptop Find YOU!&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-2-django-and-scrapy/">Part 2: Django and Scrapy&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-3-celery-mailgun-and-flower/">Part 3: Celery, Mailgun and Flower&lt;/a>&lt;/li>
&lt;li>Part 4: Deploying and using CarbAlert (this page)&lt;/li>
&lt;/ul>
&lt;p>CarbAlert on GitHub: &lt;a href="https://github.com/dinofizz/carbalert">https://github.com/dinofizz/carbalert&lt;/a>&lt;/p>
&lt;h1 id="docker">Docker&lt;/h1>
&lt;p>As part of my quest to learn new things I wanted to deploy my CarbAlert solution using &lt;a href="https://www.docker.com/">Docker&lt;/a> and &lt;a href="https://docs.docker.com/compose/">Docker Compose&lt;/a>. Docker Compose is well suited to this application as it enables a group of related Docker containers to be built and deployed together.&lt;/p></description></item><item><title>CarbAlert - Part 3: Celery, Mailgun and Flower</title><link>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-3-celery-mailgun-and-flower/</link><pubDate>Sun, 14 Oct 2018 08:20:49 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-3-celery-mailgun-and-flower/</guid><description>&lt;p>This is part 3 of a 4 part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/">Part 1: Let Your Next Laptop Find YOU!&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-2-django-and-scrapy/">Part 2: Django and Scrapy&lt;/a>&lt;/li>
&lt;li>Part 3: Celery, Mailgun and Flower (this page)&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-4-deploying-and-using-carbalert/">Part 4: Deploying and using CarbAlert&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>CarbAlert on GitHub: &lt;a href="https://github.com/dinofizz/carbalert">https://github.com/dinofizz/carbalert&lt;/a>&lt;/p>
&lt;h1 id="celery">Celery&lt;/h1>
&lt;p>&lt;a href="http://www.celeryproject.org/">Celery&lt;/a> is a distributed task queue framework. In conjunction with a message broker (in my case &lt;a href="https://redis.io/">Redis&lt;/a>) it can be used to process asynchronous tasks as well as schedule periodic tasks. I am using both of these features:&lt;/p>
&lt;ul>
&lt;li>A periodic task runs every 5 minutes, initiating the Scrapy &lt;code>CarbSpider&lt;/code> to scan the first page of the Carbonite Laptop forum index for new threads featuring search phrases of interest.&lt;/li>
&lt;li>From within the &lt;code>CarbPipeline&lt;/code> activity I push asynchronous email tasks for Celery to handle. This separates the sending of my email notifications from the parsing of the thread metadata.&lt;/li>
&lt;/ul></description></item><item><title>CarbAlert - Part 2: Django and Scrapy</title><link>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-2-django-and-scrapy/</link><pubDate>Sun, 14 Oct 2018 08:13:05 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-2-django-and-scrapy/</guid><description>&lt;p>This is part 2 of a 4 part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/">Part 1: Let Your Next Laptop Find YOU!&lt;/a>&lt;/li>
&lt;li>Part 2: Django and Scrapy (this page)&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-3-celery-mailgun-and-flower/">Part 3: Celery, Mailgun and Flower&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-4-deploying-and-using-carbalert/">Part 4: Deploying and using CarbAlert&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>CarbAlert on GitHub: &lt;a href="https://github.com/dinofizz/carbalert">https://github.com/dinofizz/carbalert&lt;/a>&lt;/p>
&lt;h1 id="django">Django&lt;/h1>
&lt;p>In order to manage the search phrases and email addresses I am using &lt;a href="https://www.djangoproject.com/">Django&lt;/a>. Django is a Python web framework, and is known for including many extras right out of the box. I am taking advantage of two specific extras: Django&amp;rsquo;s built-in &lt;a href="https://en.wikipedia.org/wiki/Object-relational_mapping">ORM&lt;/a> and the Django admin console.&lt;/p></description></item><item><title>CarbAlert - Part 1: Let Your Next Laptop Find YOU!</title><link>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/</link><pubDate>Sun, 14 Oct 2018 08:07:42 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/</guid><description>&lt;p>This is part 1 of a 4 part series of articles where I explain how I discovered and purchased my laptop by building a web application which scrapes a local PC parts forum and sends automated email alerts when posts featuring specific keywords appear:&lt;/p>
&lt;ul>
&lt;li>Part 1: Let Your Next Laptop Find YOU! (this page)&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-2-django-and-scrapy/">Part 2: Django and Scrapy&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-3-celery-mailgun-and-flower/">Part 3: Celery, Mailgun and Flower&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-4-deploying-and-using-carbalert/">Part 4: Deploying and using CarbAlert&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>CarbAlert on GitHub: &lt;a href="https://github.com/dinofizz/carbalert">https://github.com/dinofizz/carbalert&lt;/a>&lt;/p>
&lt;h1 id="tldr">TL;DR&lt;/h1>
&lt;p>
&lt;figure >
&lt;a href="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/emails.png">
&lt;span class="caption-wrapper">
&lt;img class="caption" style="max-width: 100%; height: auto;" src="https://www.dinofizzotti.com/blog/2018-10-14-carbalert-part-1-let-your-next-laptop-find-you/emails_hua8d65ca21d0d5b5b894658c6c303f57d_274202_1200x0_resize_box_2.png" width="1200" height="675" alt="Web inbox">
&lt;/span>
&lt;/a>
&lt;/figure>
&lt;!-- raw HTML omitted -->&lt;/p>
&lt;p>CarbAlert is a web application which scrapes a local (South African) second-hand computer parts forum for new posts offering laptops featuring keywords of interest (specifically the &lt;a href="https://carbonite.co.za/index.php?forums/laptops.32/">first page of the &amp;ldquo;laptops&amp;rdquo; forum&lt;/a>). I&amp;rsquo;m using &lt;a href="https://www.djangoproject.com/">Django&lt;/a> for the admin console and database/ORM integration, &lt;a href="https://scrapy.org/">Scrapy&lt;/a> for web-scraping, &lt;a href="http://www.celeryproject.org/">Celery&lt;/a> for task management, &lt;a href="https://flower.readthedocs.io/en/latest/">Flower&lt;/a> for task monitoring and &lt;a href="https://www.mailgun.com/">Mailgun&lt;/a> for sending out alert emails. I am using &lt;a href="https://www.docker.com/">Docker&lt;/a> to manage and run the containers which make up the CarbAlert application.&lt;/p>
&lt;p>&lt;strong>Find the code here:&lt;/strong> &lt;a href="https://github.com/dinofizz/carbalert">https://github.com/dinofizz/carbalert&lt;/a>&lt;/p></description></item><item><title>NoiseBlanket: Arduino White Noise Player with IR Remote Control</title><link>https://www.dinofizzotti.com/blog/2017-08-21-noiseblanket-arduino-white-noise-player-with-ir-remote-control/</link><pubDate>Mon, 21 Aug 2017 00:00:00 +0000</pubDate><guid>https://www.dinofizzotti.com/blog/2017-08-21-noiseblanket-arduino-white-noise-player-with-ir-remote-control/</guid><description>Overview After a recent sinus/ear infection I began to experience tinnitus. In my case it presents itself as a constant high frequency static hiss/whine in my left ear. It&amp;rsquo;s been about a month since I first noticed it. Hopefully it will eventually disappear. During the day when I&amp;rsquo;m at work or around friends I don&amp;rsquo;t really notice it. However when I&amp;rsquo;m in a quiet room, such as when falling asleep or in the early morning after waking up, it is quite noticeable and distracting.</description></item><item><title>A Python wrapper for the Adafruit USB/Serial LCD Backpack</title><link>https://www.dinofizzotti.com/blog/2017-06-10-a-python-wrapper-for-the-adafruit-usb/serial-lcd-backpack/</link><pubDate>Sat, 10 Jun 2017 08:11:36 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2017-06-10-a-python-wrapper-for-the-adafruit-usb/serial-lcd-backpack/</guid><description>Overview I&amp;rsquo;m working on a project which requires a character LCD to work with a Raspberry Pi. A character LCD is the thing displaying the text in the image above. The one I am currently using is a &amp;ldquo;green backlit 16x2&amp;rdquo; character LCD. This means that the display is capable of displaying 2 rows of 16 characters, and it features black characters on a green background. 
There are various sizes and colour combinations (text and backlight) available.</description></item><item><title>Adding Hugo version and commit information to a status page</title><link>https://www.dinofizzotti.com/blog/2017-05-01-adding-hugo-version-and-commit-information-to-a-status-page/</link><pubDate>Mon, 01 May 2017 18:25:44 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2017-05-01-adding-hugo-version-and-commit-information-to-a-status-page/</guid><description>As described in a previous post, I have set up a GitLab CI runner to build and deploy this blog.
For obvious content changes it is easy to see if they have been applied - I can just visit the site itself. For changes that are a bit more &amp;ldquo;behind the scenes&amp;rdquo; it may not be so easy to determine whether changes have been applied, and what they were. For example: updating the version of Hugo used by the GitLab CI runner to generate the site.</description></item><item><title>Domain Name Change</title><link>https://www.dinofizzotti.com/blog/2017-03-26-domain-name-change/</link><pubDate>Sun, 26 Mar 2017 09:04:34 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2017-03-26-domain-name-change/</guid><description>FYI I have decided to &amp;ldquo;re-brand&amp;rdquo; this blog under my own name. I have 301&amp;rsquo;d all the server blocks for practicalmagic.co.za to point to dinofizzotti.com.
I also promise to actually blog more often (sorry).</description></item><item><title>Automated blog posts with Hugo, GitLab CI and Docker</title><link>https://www.dinofizzotti.com/blog/2016-10-18-automated-blog-posts-with-hugo-gitlab-ci-and-docker/</link><pubDate>Tue, 18 Oct 2016 16:52:22 +0200</pubDate><guid>https://www.dinofizzotti.com/blog/2016-10-18-automated-blog-posts-with-hugo-gitlab-ci-and-docker/</guid><description>&amp;hellip;or how this blog is built and deployed.
Introduction I&amp;rsquo;m using the tools and methods described in this post because I wanted to learn more about Hugo, GitLab CI and Docker. I don&amp;rsquo;t claim this is the best way of combining these technologies, or that everyone should do it this way.
I wanted to create a blog and publish content in a way that felt fun, and at the same time learn something new.</description></item></channel></rss>