Very insightful paper (https://t.co/WCbSibCwux) comparing XGBoost and Logistic Regression (LR) for binary classification, particularly on imbalanced datasets such as burned-area detection from satellite imagery.
— Śreyāśḥ D 🚀 𝗲/𝗮𝗰𝗰 (@xegression) July 20, 2025
Some key points summarizing the comparison: pic.twitter.com/tFfjq8Bv9h
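The paper itself isn't reproduced here, but as a rough illustration of the kind of head-to-head it runs, here is a minimal sketch comparing the two models on an imbalanced binary task. It assumes scikit-learn and xgboost are installed; the synthetic data and the ~5% positive rate are made up for illustration, not taken from the paper.

```python
# Minimal sketch: XGBoost vs Logistic Regression on an imbalanced binary task.
# The synthetic data below is illustrative only; it is NOT the burned-area
# dataset used in the paper.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Imbalanced toy data: roughly 5% positives (e.g., "burned" pixels).
X, y = make_classification(
    n_samples=20_000, n_features=20, weights=[0.95, 0.05], random_state=0
)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# Give both models a class-imbalance correction so the comparison is fair.
neg, pos = (y_tr == 0).sum(), (y_tr == 1).sum()
models = {
    "LogisticRegression": LogisticRegression(max_iter=1000, class_weight="balanced"),
    "XGBoost": XGBClassifier(
        n_estimators=300, scale_pos_weight=neg / pos, eval_metric="logloss"
    ),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]
    pred = (proba >= 0.5).astype(int)
    print(f"{name}: F1={f1_score(y_te, pred):.3f}  AUC={roc_auc_score(y_te, proba):.3f}")
```

On real imbalanced data the interesting part is usually which threshold and which metric (F1, AUC, recall on the rare class) each model wins on, which is exactly what the paper digs into.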
In 1886, Francis Galton observed that children’s heights tended to be closer to the average than their parents’.
— Śreyāśḥ D 🚀 𝗲/𝗮𝗰𝗰 (@xegression) August 5, 2025
This “regression toward the mean” led to the term we now use in statistics: regression. pic.twitter.com/ezkn6w9bDO
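A quick way to see the effect numerically: if standardized parent and child heights are correlated with ρ < 1, the expected child height given a tall parent is only ρ times the parent's deviation from the mean. The tiny simulation below uses made-up numbers (ρ = 0.5), not Galton's actual data.

```python
# Minimal illustration of regression toward the mean
# (illustrative numbers, not Galton's height data).
import numpy as np

rng = np.random.default_rng(0)
n, rho = 100_000, 0.5          # assumed parent-child height correlation

parent = rng.standard_normal(n)                               # standardized parent height
child = rho * parent + np.sqrt(1 - rho**2) * rng.standard_normal(n)

tall = parent > 1.5                                           # parents ~1.5 SD above average
print("mean parent height (tall group):", parent[tall].mean())
print("mean child height  (tall group):", child[tall].mean())
# Children of tall parents are, on average, closer to the mean (0)
# than their parents, since E[child | parent] = rho * parent.
```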
One of the most cited papers: Leo Breiman's paper introducing random forests (though the idea had been around long before).
— Śreyāśḥ D 🚀 𝗲/𝗮𝗰𝗰 (@xegression) July 16, 2025
He also has a website for it, and I just got to know that the Random Forest algorithm itself is not patented, but the name is trademarked. https://t.co/X0fXWL4uWJ https://t.co/AM5wpPSRMC pic.twitter.com/NH5eQ5H4M9
If you are tired of predictive ML models, here are some project/paper ideas:
— Śreyāśḥ D 🚀 𝗲/𝗮𝗰𝗰 (@xegression) August 5, 2025
📘 Study the following material (lecture + paper + code) on martingale posterior distributions, and
✍️ Write a paper applying the framework to path-dependent European options (e.g., lookback options); see the sketch after this thread excerpt.
1/3 pic.twitter.com/TXKx9DJ9bT
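For context on what the second step involves: a lookback option's payoff depends on the whole price path (here, the running minimum), not just the terminal price. Below is a minimal Monte Carlo sketch of a floating-strike lookback call under plain geometric Brownian motion; all parameters are made up, and in the proposed project the fixed GBM dynamics would be replaced by paths drawn from a martingale posterior via predictive resampling.

```python
# Minimal sketch: Monte Carlo pricing of a floating-strike lookback call,
# payoff = S_T - min_t S_t (path-dependent, European exercise).
# Parameters are illustrative; a martingale-posterior version would replace
# the fixed GBM dynamics below with predictively resampled return paths.
import numpy as np

rng = np.random.default_rng(0)
S0, r, sigma, T = 100.0, 0.02, 0.2, 1.0   # assumed market parameters
n_paths, n_steps = 50_000, 252
dt = T / n_steps

# Simulate GBM paths via cumulative log-increments.
z = rng.standard_normal((n_paths, n_steps))
log_increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
log_paths = np.cumsum(log_increments, axis=1)
S = S0 * np.exp(np.hstack([np.zeros((n_paths, 1)), log_paths]))

# Path-dependent payoff: terminal price minus running minimum.
payoff = S[:, -1] - S.min(axis=1)
price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"lookback call ≈ {price:.3f} ± {1.96 * stderr:.3f}")
```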
It always amazes me how one day George Dantzig arrived late to a statistics class, mistook two unsolved problems on the board for homework, and ended up proving them, which ultimately became his PhD thesis.
— Śreyāśḥ D 🚀 𝗲/𝗮𝗰𝗰 (@xegression) June 28, 2025
Might be one of the fastest thesis completions in academic history. pic.twitter.com/SC8S0IHTK0