Love the post and the series! I hope you come up with something similar in the future.
I also have a question. When you wrote
"A critical factor at play is the inherent 'locality inductive bias' in neural networks, which can limit their ability to capture patterns that aren't naturally localized in a tabular structure."
Isn't a locality inductive bias only present in CNN and RNN architectures, not in fully connected networks or transformers?
Great post!
I have been in the data science world for a short time, and posts like this are helping me a lot to understand the fundamentals of ML and DS.
What are your honest thoughts on TabPFN?
Great stuff and very much agree.
Until neural networks can learn to sort and discretely partition attributes with few computations, XGBoost will continue to be all you need.
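A minimal sketch of that point, assuming scikit-learn is available (the synthetic data, the 37.5 threshold, and the stump are purely illustrative, not from the post): a single tree split recovers a discrete, axis-aligned partition of an attribute in one comparison, which is the operation XGBoost-style ensembles build on.

```python
# Illustrative sketch: a depth-1 decision tree (stump) finds a hard threshold
# on one tabular attribute directly, the kind of discrete partition the
# comment above refers to.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(1000, 1))   # one tabular attribute
y = (X[:, 0] > 37.5).astype(int)          # target defined by a hard threshold

stump = DecisionTreeClassifier(max_depth=1).fit(X, y)
print(stump.tree_.threshold[0])           # ~37.5: the split point is recovered
print(stump.score(X, y))                  # 1.0 on this toy data

# A small fully connected network would have to approximate the same step
# function with smooth activations, typically needing more parameters and
# training iterations to match this single-split decision boundary.
```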