At home, I use machine learning and AI mainly for research purposes, as I have the freedom to experiment with new ideas or learn new networks without any external constraints or requirements.
The only requirement I have more or less imposed on myself is that the AI infrastructure must be kept as simple as possible.
First, because I am its sole user, deployment and packaging are not an issue. Second, I want to spend most of my time creatively. I am an engineer and a landscape photographer, and creativity is the reason I like fiddling with machine learning.
A "prepare, compute and store" workflow
My AI infrastructure is based on a "prepare, compute and store" workflow. The prepare stage involves preparing my dataset and Jupyter notebooks on my MacBook, with everything version-controlled in Git. Sometimes I also prepare the dataset on my iPhone with the iOS Shortcuts app and store it in JSON format. The compute stage is the most time-consuming one; I use either TensorFlow or Keras running under Docker. It is unwise to store gigabytes of data inside the Docker image, hence I use a NAS for storage.
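To make the workflow concrete, here is a minimal sketch of the compute stage in Python. The file paths, the JSON layout (a list of `{"features": [...], "label": ...}` records) and the toy model are all assumptions for illustration; the one real point is that data is read from, and results written back to, the NAS mount rather than the Docker image.

```python
# Minimal sketch of the "compute" stage, assuming a JSON dataset prepared
# on the Mac or iPhone and a NAS share mounted into the Docker container
# at /mnt/nas (hypothetical path).
import json

import numpy as np
from tensorflow import keras

DATASET_PATH = "/mnt/nas/datasets/samples.json"  # hypothetical NAS mount point
MODEL_PATH = "/mnt/nas/models/model.h5"          # trained model also goes to the NAS

# Load the JSON dataset; the layout is an assumption for this example:
# a list of {"features": [float, ...], "label": int} records.
with open(DATASET_PATH) as f:
    records = json.load(f)
x = np.array([r["features"] for r in records], dtype="float32")
y = np.array([r["label"] for r in records], dtype="int32")

# A deliberately tiny Keras model: the point is simplicity, not accuracy.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(x.shape[1],)),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32)

# Store the trained model on the NAS, never inside the Docker image.
model.save(MODEL_PATH)
```

Inside Docker, the NAS share only needs to be mounted as a volume (for example with `docker run -v <nas-share>:/mnt/nas ...`), so the container itself stays small and disposable.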
There are cloud-based solutions such as Domino or Google Colab which could certainly improve and speed up my infrastructure. However, at home, I am not aiming for a production platform.