System/GPU recommendation

I am not sure if this is the right thread to post this.

I’m currently a Master’s student and have been doing deep learning for about a year. My interests are in embedded AI and TinyML, which is why I completed the two HarvardX courses. My master’s thesis will be on reinforcement learning (wireless network rate and power control), so I have decided to invest in a machine, but I am financially constrained to about $2500. What GPU or rig would you recommend? I am currently considering an Alienware with an RTX 3060 6 GB and one with an RTX 2080 Ti 11 GB, with only a $250 difference between them.

Hello @Predstan,

Allow me to rephrase your question, please. You want advice on allocating a budget (with recognizable constraints) across CPU, GPU, or TPU? (Yes, the latter should be a selection criterion if you’re diving into ML; the Google Voice Kit v1.0, not v1.1, on an RPi was quite an education for me.) If you are not into gaming or graphics-pipeline rendering, then go easy on the GPU, because there will always be a faster one around the corner and you’ll get stuck in a second-guessing loop.

My recommendation (impractical under the current environmental conditions) is to do some very basic hands-on assessment at a local bricks-and-mortar store to gauge your comfort level with the hardware. Alas, you will not be able to run your favorite ML tests there, but most of these stores run basic benchmark tests, and you will notice a perceptible difference between the different grades of GPU. For ML, as you understand very well, the GPU is the all-rounder, to borrow a sports term. Again, there are only two gorillas to consider for the underlying technology, but the OEMs have varying warranties and support. It would be in your interest to include these as secondary but important selection criteria.

Kind regards.


Thank you, I will look into this.

You should look into the Intel Neural Compute Stick 2. The cost is only around $150; however, I am having some difficulty understanding the software API. Its main advantage is that it can be used with several different machines, and I am assured people have it working on the RPi 3.

Here is the link for folks looking for it:

I thought there was a Neural Compute Stick 3 version, but I can’t find it.

I think it uses the OpenVINO software toolkit.
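For what it’s worth, here is roughly what driving the stick through the OpenVINO Python API looks like. I’m writing this from memory, so treat the exact names as assumptions and check the current docs; "model.xml" is a placeholder IR file and "MYRIAD" is the device name the NCS2 is supposed to register as:

```python
# Rough sketch of NCS2 inference via the OpenVINO Python runtime (from memory;
# verify against the current OpenVINO docs). "model.xml" is a placeholder
# Intermediate Representation file; "MYRIAD" is the NCS2 device name.
import numpy as np
from openvino.runtime import Core

core = Core()
print(core.available_devices)            # should list "MYRIAD" when the stick is plugged in

model = core.read_model("model.xml")     # IR produced by the model optimizer
compiled = core.compile_model(model, "MYRIAD")

# Build a dummy input matching the model's first input shape and run inference.
input_shape = tuple(compiled.input(0).shape)
dummy_input = np.zeros(input_shape, dtype=np.float32)
result = compiled([dummy_input])[compiled.output(0)]
print(result.shape)
```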

Is the Intel Neural Compute Stick used to train models, or just to make faster inferences?

Good question, I’d like to understand that as well. I’m assuming it’s for inference that you run locally, while you still train your application in a cloud environment.

Intel NCS is meant solely for inference at the edge/endpoint.


Thanks! For those of us who are interested in training models locally (as opposed to using Colab), do you have hardware recommendations?

My new desktop is a work in progress and doesn’t have a discrete GPU yet, so I’m contemplating options.

@Predstan
Not sure if this is helpful but here is my take.

How do you plan to parallelize the reinforcement learning algorithm for your use case? If you can spin up parallel environments, then it is also worth considering a good CPU SKU with a decent number of cores: each parallel environment is a process that will occupy roughly one CPU core.
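For illustration, here is a rough sketch of what CPU-parallel rollouts can look like using just Python’s standard-library multiprocessing; the environment dynamics and reward here are dummy placeholders, not a real wireless rate/power-control environment:

```python
# Minimal sketch of CPU-parallel environment rollouts with multiprocessing.
# The "environment" below is a stand-in; swap in your own RL env and policy.
import multiprocessing as mp
import random

def rollout_worker(env_id, steps, result_queue):
    """Each worker process occupies roughly one CPU core while stepping its env."""
    env_state, total_reward = 0.0, 0.0
    for _ in range(steps):
        action = random.random()          # stand-in for a policy action
        env_state += action               # stand-in for environment dynamics
        total_reward += -abs(env_state)   # stand-in for a reward signal
    result_queue.put((env_id, total_reward))

if __name__ == "__main__":
    n_envs = mp.cpu_count()               # one environment process per core
    queue = mp.Queue()
    workers = [mp.Process(target=rollout_worker, args=(i, 10_000, queue))
               for i in range(n_envs)]
    for w in workers:
        w.start()
    results = [queue.get() for _ in workers]   # collect before joining
    for w in workers:
        w.join()
    print(sorted(results))
```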

Also, it is important to consider GPU memory. 6 GB is very small, and you will quickly find that training a large policy is not possible on the RTX 3060, so I would go with the GPU that has more memory (i.e., the RTX 2080 Ti with 11 GB).
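As a rough sanity check before buying, something like this (assuming PyTorch; the layer sizes are made up, so substitute your own policy architecture) lets you estimate how much memory a candidate policy needs relative to what the card offers:

```python
# Estimate a policy network's parameter memory and compare against the GPU.
# Assumes PyTorch; the architecture below is an arbitrary placeholder.
import torch
import torch.nn as nn

policy = nn.Sequential(
    nn.Linear(256, 4096), nn.ReLU(),
    nn.Linear(4096, 4096), nn.ReLU(),
    nn.Linear(4096, 64),
)

n_params = sum(p.numel() for p in policy.parameters())
print(f"parameters: {n_params / 1e6:.1f} M "
      f"(~{n_params * 4 / 1e9:.2f} GB in fp32, before activations and optimizer state)")

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1e9:.1f} GB total")
```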

Currently, there’s a worldwide shortage of modern NVIDIA GPUs, driven by a cryptocurrency mining boom.
You should state what your experience has been using Colab.
It’s possible to see there whether you are core constrained or memory constrained (see the snippet below).
4 GB of GPU RAM might be plenty for you, in which case an old 1050 Ti would be great.
If you need more RAM, then look for a used 1080 Ti or a newer 3060.
Be prepared to pay 2-3x MSRP for GPUs right now. Yup, $700 to $1000 for an RTX 3060.
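If you want to check the memory side in Colab, and assuming you’re on PyTorch, something along these lines records the peak GPU memory your training steps actually allocate:

```python
# Gauge whether a Colab run is GPU-memory bound (assumes PyTorch with CUDA).
import torch

if torch.cuda.is_available():
    torch.cuda.reset_peak_memory_stats()
    # ... run a few representative training iterations here ...
    peak_gb = torch.cuda.max_memory_allocated() / 1e9
    total_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
    print(f"peak allocation: {peak_gb:.2f} GB of {total_gb:.1f} GB available")
```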

Note: you’re going to have a steep learning curve when installing Linux and packages on your new machine. Budget a few late nights with two computers side by side so you can read up on how to get unstuck.
That said, after a year of working in Colab it’s a good time to expand to a dedicated workstation.