State of the Art in AI Research (Oct 4)
Who publishes state-of-the-art AI research these days? The question is frequently asked. Of course, large companies contribute but tend to be…
LLM/RAG: Knowledge Graphs, Multi-Agents, Ultrafast Fine-tuning, No Latency (Jun 21)
In this presentation, I explain all the components of a ground-breaking architecture with applications to local, enterprise LLMs and…
Free GenAI course with deep tech dive into the new generation of LLMs (Mar 22)
The GenAItechLab Fellowship program allows participants to work on state-of-the-art, enterprise-grade projects, entirely for free, at their…
My Top 10 GenAI Articles of the Year (Dec 22, 2023)
Here is some good reading for the holiday season. It is more than just reading, as the material includes full Python implementations and datasets…
GenAI-Evaluation: New Open Source Python Library Now Available (Sep 22, 2023)
Tested on multiple public data sets with my own NoGAN synthesizers (1000x faster and consistently better than solutions offered by…
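For readers curious what scoring a synthesizer can look like in practice, here is a generic sketch based on per-column Kolmogorov-Smirnov distances computed with SciPy. This is an illustrative stand-in under my own assumptions, not the GenAI-Evaluation library's actual API; every name in it is hypothetical.

```python
# Generic sketch of one common way to score synthetic data quality:
# per-column Kolmogorov-Smirnov distances between real and generated samples.
# Illustrative only; NOT the GenAI-Evaluation library's API.
import numpy as np
from scipy.stats import ks_2samp

def ks_scores(real: np.ndarray, synthetic: np.ndarray) -> list[float]:
    """Smaller KS statistics mean the synthetic columns track the real ones."""
    return [ks_2samp(real[:, j], synthetic[:, j]).statistic
            for j in range(real.shape[1])]

rng = np.random.default_rng(1)
real = rng.normal(size=(1000, 3))
good = rng.normal(size=(1000, 3))            # drawn from the same distribution
bad = rng.normal(loc=1.0, size=(1000, 3))    # shifted distribution
print("good synthesizer:", [round(s, 3) for s in ks_scores(real, good)])
print("poor synthesizer:", [round(s, 3) for s in ks_scores(real, bad)])
```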
Generative AI Technology Breakthrough: Spectacular Performance of New Synthesizer (Aug 2, 2023)
I introduce a new, NoGAN alternative to standard data synthetization methods. It is designed to run faster by several orders of magnitude…
Generated Data vs Monte-Carlo Simulations: What are the Differences? (Aug 1, 2023)
I sometimes get asked this question: could you use simulations instead of synthetizations? Below is my answer, also focusing on some…
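To make the distinction concrete, here is a minimal sketch contrasting the two routes: a Monte-Carlo simulation samples from an assumed parametric model, while a (deliberately naive) synthetization resamples the empirical distribution of the observed data. The dataset and model choices are illustrative assumptions, not taken from the article.

```python
# Hedged sketch: Monte-Carlo simulation (sample from an assumed model)
# vs. naive synthetization (resample the empirical distribution).
import numpy as np

rng = np.random.default_rng(42)

# "Real" observations: skewed data whose true distribution we pretend not to know.
real = rng.lognormal(mean=0.0, sigma=0.8, size=1000)

# Monte-Carlo route: assume a model (here, wrongly, a normal), fit, simulate.
mc_sample = rng.normal(loc=real.mean(), scale=real.std(), size=1000)

# Synthetization route (one naive flavor): sample the empirical quantiles,
# so the generated data mimics the observed distribution model-free.
u = rng.uniform(0.0, 1.0, size=1000)
synthetic = np.quantile(real, u)

# The simulation inherits the model's symmetry assumption; the synthetic
# data stays close to the observed skew (mean minus median as a crude gauge).
print(f"real gap (mean-median):        {real.mean() - np.median(real):.3f}")
print(f"Monte-Carlo gap (mean-median): {mc_sample.mean() - np.median(mc_sample):.3f}")
print(f"synthetic gap (mean-median):   {synthetic.mean() - np.median(synthetic):.3f}")
```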
Sound Generation in Python: Turning Your Data into Music (Jul 14, 2023)
Not long ago, I published an article entitled “The Sound that Data Makes”. The goal was turning data (random noise in this case) into…
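As a flavor of what such a pipeline can look like, here is a minimal sketch (not the article's code) that maps a numeric series to audible pitches and writes a WAV file using NumPy and Python's standard wave module; the frequency range and note length are arbitrary choices.

```python
# Hedged sketch: turn a numeric series into a sequence of tones in a WAV file.
# Frequency mapping and note duration are illustrative assumptions.
import wave
import numpy as np

def data_to_wav(values, filename="data_music.wav", rate=44100, note_sec=0.25):
    """Map each data point to a tone whose pitch grows with the value."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    samples = []
    for v in values:
        freq = 220.0 + 660.0 * (v - lo) / span          # 220-880 Hz range
        t = np.arange(int(rate * note_sec)) / rate
        samples.append(0.5 * np.sin(2 * np.pi * freq * t))
    audio = np.concatenate(samples)
    pcm = (audio * 32767).astype(np.int16)               # 16-bit mono PCM
    with wave.open(filename, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

# Example: "random noise" data, echoing the article's premise.
data_to_wav(np.random.default_rng(0).normal(size=40).tolist())
```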
A Synthetic Stock Exchange Played with Real Money (May 16, 2023)
Not only that, but you can predict (more precisely, compute with absolute certainty) what the value of any stock will be tomorrow…
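One hedged way to read the "absolute certainty" claim: in a synthetic exchange, prices come from a deterministic generator, so anyone holding the seed can recompute tomorrow's value exactly. The seeded random walk below is an assumed illustration, not the article's actual price process.

```python
# Hedged sketch: a seeded generator makes future "prices" exactly computable.
# The log-normal random walk is an illustrative assumption.
import numpy as np

def synthetic_price(ticker_seed: int, day: int, start: float = 100.0) -> float:
    """Price of a synthetic stock on a given day, fully determined by the seed."""
    rng = np.random.default_rng(ticker_seed)
    log_returns = rng.normal(loc=0.0, scale=0.02, size=day)  # daily moves
    return start * float(np.exp(log_returns.sum()))

today = synthetic_price(ticker_seed=7, day=100)
tomorrow = synthetic_price(ticker_seed=7, day=101)   # knowable in advance
print(f"today: {today:.2f}  tomorrow: {tomorrow:.2f}")
```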
Massively Speed Up your Learning Algorithm, with Stochastic Thinning (Apr 17, 2023)
You have to see it to believe it! Imagine a technique where you randomly delete as many as 80% of your observations in the training set…
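Here is a minimal sketch of the thinning idea as the teaser describes it, assuming a toy dataset and a logistic-regression model (both illustrative choices, not from the article): drop roughly 80% of training rows at random, fit on the remainder, and compare test accuracy against the full fit.

```python
# Hedged sketch of stochastic thinning as described above: train on a random
# ~20% subsample and compare against the full training set. Dataset and model
# are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
keep = rng.random(len(X_tr)) > 0.80          # keep ~20% of the training rows

full = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
thin = LogisticRegression(max_iter=1000).fit(X_tr[keep], y_tr[keep])

print(f"full training set  : {accuracy_score(y_te, full.predict(X_te)):.3f}")
print(f"thinned (~20% kept): {accuracy_score(y_te, thin.predict(X_te)):.3f}")
```

Fitting on a fifth of the rows cuts training cost roughly fivefold; how much accuracy survives depends on the model and data, which is the trade-off the article explores.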