In this post, I will discuss the steps I took to make my own LIFX web application using the Flask web framework.

Image by Markus Winkler

This is a follow-up to last week's post, but you can download the repo if you'd like to jump in here. To recap, we used Python and the LIFX API to control requests sent to our Wi-Fi-enabled lights. To build an application, we first have to look at all of its components. I usually start with a rough sketch or a whiteboard.

Here’s my whiteboard (don’t judge…):

In this post, I will share how to handle LIFX API requests with Python.

Image by Kreeson Naraidoo

To start, we need to access the LIFX HTTP API documentation. You will need a set of LIFX lights for this to work. All requests require an OAuth 2 access token from your account settings.

First, we will need a variable to store our access token. Then we can create methods to handle the different API requests.

import requests

class LIFX(object):
    def __init__(self):
        self.token = "your_access_token"

I want to be able to turn the lights on/off, and also switch to the preset light settings I’ve already configured on my phone. Before I begin, I will need…
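The on/off control can be built on the documented LIFX HTTP endpoints. Here is a minimal sketch, assuming the standard api.lifx.com toggle and state routes from the API docs; the token argument and the method names are placeholders for illustration:

```python
import requests

class LIFX(object):
    """Minimal LIFX HTTP API client sketch; pass your real access token."""
    BASE_URL = "https://api.lifx.com/v1"

    def __init__(self, token):
        self.token = token
        # every request authenticates with a Bearer token header
        self.headers = {"Authorization": f"Bearer {self.token}"}

    def toggle(self, selector="all"):
        # POST /v1/lights/{selector}/toggle flips the power state
        return requests.post(
            f"{self.BASE_URL}/lights/{selector}/toggle", headers=self.headers
        )

    def set_state(self, selector="all", **state):
        # PUT /v1/lights/{selector}/state accepts power, color, brightness, etc.
        return requests.put(
            f"{self.BASE_URL}/lights/{selector}/state",
            json=state, headers=self.headers,
        )
```

With this in place, `LIFX("token").set_state(power="on", brightness=0.5)` would dim the lights to half brightness.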

Compare the top 200 supported assets on Kucoin’s decentralized exchange.

Photo by Eftakher Alam

Everyone knows that one person, or couple, or teenager, that made millions on Bitcoin. While Bitcoin is still the most popular digital asset with the highest market cap, there have been several advancements in blockchain technology since the world was introduced to the distributed ledger.

I am a crypto enthusiast and have a few friends that try to keep up with the crypto market like the stock market, except that the crypto markets are open 24/7, and there are thousands of projects on hundreds of exchanges. …

In this post, I will use KMeans, an unsupervised learning algorithm from Scikit-Learn, to compare Houston artists using Spotify's Web API.

Photo by ThisIsEngineering from Pexels

I will also walk through the OSEMN framework for this machine learning example. The acronym OSEMN stands for Obtain, Scrub, Explore, Model, and iNterpret, and it is one of the most common frameworks Data Scientists use when working on machine learning problems.

Without further ado, let's get started.

Obtaining Data Using Spotify API

First, we use the Spotify API client to obtain our data. If you are not familiar with the Spotify API, you will find CodeEntrepreneur's 30 Days of Python — Day 19 — The Spotify API — Python TUTORIAL very helpful, especially since the official documentation examples are written in JavaScript. …
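To preview the modeling step, here is a minimal KMeans sketch on placeholder audio-feature vectors. The feature values below are made-up stand-ins for real Spotify data, which requires API credentials; the feature names are just examples of what the API returns:

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder audio-feature rows (e.g. danceability, energy, valence)
# standing in for features pulled from the Spotify Web API.
features = np.array([
    [0.90, 0.80, 0.70],
    [0.85, 0.75, 0.65],
    [0.20, 0.30, 0.10],
    [0.25, 0.35, 0.15],
])

# fit two clusters and assign each artist a cluster label
kmeans = KMeans(n_clusters=2, n_init=10, random_state=42)
labels = kmeans.fit_predict(features)
```

Artists whose tracks share a sonic profile land in the same cluster, which is the comparison the post builds on.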

Photo by Author

In this post, I will share how to deploy a pre-trained model to a locally hosted computer with Flask, OpenCV, and Keras. I initially deployed this model on PythonAnywhere using Flask, Keras, and jQuery. The application was designed for remote school classroom or workplace settings that require students or employees to shave their facial hair.

The application allowed users to upload a photo, and click a button to send a post request with the encoded image data to the backend of the website. The image transformation and classification were handled on the backend, and the results were returned to the…
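The request flow described above can be sketched as a minimal Flask app. The `/predict` route name and the `classify` stub are illustrative placeholders, not the app's actual code; the real backend decodes the image with OpenCV and runs the Keras model:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def classify(image_bytes):
    # Placeholder for the OpenCV preprocessing + Keras prediction step;
    # the real app decodes the bytes into an image and runs the model.
    return "shaved"

@app.route("/predict", methods=["POST"])
def predict():
    # the front end posts the encoded image data in the request body
    result = classify(request.get_data())
    # return the classification result for the page to display
    return jsonify({"label": result})
```

The front-end button simply POSTs the encoded image to this route and renders the JSON response.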

In this short post, I will share how to split and join a pickled model file using python and the os library to bypass upload limits on PythonAnywhere.

Image by Kevin Ku

In my last post, “Building a Convolutional Neural Network to Recognize Shaved vs UnShaved Faces”, I ended the article sharing the method I used to save my final trained model with Pickle.

“Pickling” is the process whereby a Python object hierarchy is converted into a byte stream, and “unpickling” is the inverse operation, whereby a byte stream (from a binary file or bytes-like object) is converted back into an object hierarchy. — Source code: Lib/

As a refresher, here is the line of code to pickle your final model, saving all the weights, without the hassle of saving the structure…
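The split-and-join approach can be sketched with the standard library alone. The chunk size below is deliberately tiny for demonstration; in practice you would pick something just under the host's upload limit, and the file names here are placeholders:

```python
import os
import pickle

# toy object standing in for the trained model
model = {"weights": list(range(1000))}

# pickle the model to disk (the final-model save step)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

CHUNK = 256  # bytes per part; tiny here, just under the upload limit in practice

# split model.pkl into numbered parts small enough to upload
with open("model.pkl", "rb") as src:
    part = 0
    while True:
        chunk = src.read(CHUNK)
        if not chunk:
            break
        with open(f"model.pkl.part{part}", "wb") as dst:
            dst.write(chunk)
        part += 1

# join the parts back into a single file on the server
with open("model_joined.pkl", "wb") as dst:
    for i in range(part):
        with open(f"model.pkl.part{i}", "rb") as src:
            dst.write(src.read())

# tidy up the uploaded parts once the join succeeds
for i in range(part):
    os.remove(f"model.pkl.part{i}")

# verify the round trip restores the original object
with open("model_joined.pkl", "rb") as f:
    restored = pickle.load(f)
```

Because the parts are concatenated in order, the joined file is byte-for-byte identical to the original pickle.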

A code-along guide to build a CNN model using computer vision and the Keras deep learning API.

Image by Josh Riener

In this tutorial, we will use an image dataset created by scraping free stock photo sites. The image set contains about 2,000 images of individual people labeled as “shaved” or “unshaved”. We will combine computer vision and machine learning to classify images using a Convolutional Neural Network (CNN).

By the end of this tutorial, you will be able to:

  1. Build a CNN model from scratch and with transfer learning
  2. Visualize model structure, hidden layers, and evaluation metrics
  3. Save your CNN Model for re-use, and/or deployment
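As a preview of the from-scratch build in step 1, a minimal Keras CNN for a binary label might look like the following. The input size and layer widths here are illustrative choices, not the tutorial's exact architecture:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small CNN for a binary shaved/unshaved label: two conv/pool stages,
# then a dense head ending in a single sigmoid unit.
model = keras.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability of one class
])

# binary cross-entropy matches the single sigmoid output
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```

The sigmoid output gives a probability of one class, so a 0.5 threshold splits “shaved” from “unshaved”.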

Importing Libraries & Loading the Image DataFrame

To code along, download my image dataset from Google Drive. All image data is stored…

A code-along guide to download images from Stock Photo sites using Selenium and Python!


This post was inspired by Fabian Bosler’s article Image Scraping with Python. Fabian does a great job explaining web scraping and provides great boilerplate code for scraping images from Google. For our purposes, we will focus on using Selenium in Python to download free stock photos from Unsplash.

Unsplash is a website dedicated to sharing stock photography under the Unsplash license. The website claims over 110,000 contributing photographers and generates more than 11 billion photo impressions per month on their growing library of over 1.5 million photos (Wikipedia).

Since Unsplash is an interactive site, using Selenium would be…

Jacob Tadesse

Data scientist transitioning from a technology consulting career.
