Just a question for a project of mine: can tabular data be combined with NLP? Or should I train two separate nets (e.g. with ULMFiT) and try to combine the results?
You can one-hot encode categorical data such as strings and feed everything into one big model. But if the data is independent, or if the tasks are orthogonal, then you might be better served by creating two models. A rough illustration of the single-model route is below.
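Here is a minimal sketch of the one-big-model idea, using scikit-learn rather than ULMFiT just to show how one-hot encoded categoricals, numeric columns, and text features can share a single feature matrix. The column names (`color`, `price`, `review_text`, `rating`) and the toy data are made up for illustration.

```python
# Hypothetical example: combining one-hot encoded tabular features with
# text features in a single model. Column names and data are made up.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import Pipeline

df = pd.DataFrame({
    "color": ["red", "blue", "red", "green"],           # categorical column
    "price": [10.0, 12.5, 9.0, 15.0],                   # continuous column
    "review_text": ["great fit", "too small", "love it", "poor quality"],
    "rating": [4.5, 2.0, 5.0, 1.5],                      # target
})

# One-hot encode the categorical column, scale the numeric one,
# and turn the free text into TF-IDF features, all in one feature matrix.
features = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["color"]),
    ("num", StandardScaler(), ["price"]),
    ("txt", TfidfVectorizer(), "review_text"),
])

model = Pipeline([("features", features), ("reg", Ridge())])
model.fit(df.drop(columns="rating"), df["rating"])
print(model.predict(df.drop(columns="rating")))
```

The same idea carries over to a neural net: concatenate categorical embeddings (or one-hot vectors) with a text encoder's output and pass the result through a shared head.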
I think most people would categorize that as machine learning or data science rather than deep learning; you might do better searching under those keywords.
What is a good course that focuses on "tabular data", in particular predicting continuous outputs from continuous inputs, aka regression?