MolBERT: Teaching Transformers to Read the Language of Chemistry
2026-01-08 • 21 min read
#machine learning #drug discovery #transformers #chemistry #smiles #bert
How researchers adapted BERT for molecular property prediction, turning SMILES strings into drug discovery insights.

Using Embedding Models to Predict Sentence Complexity
2024-10-24 • 14 min read
#nlp #embeddings #machine-learning #readability #bert #transformers
Traditional readability formulas miss the mark. Modern embedding models can capture semantic nuance and syntactic structure, but do they actually predict complexity better?