I make machines learn; better with graphs. PhD student at @EPFL_en with @trekkinglemon. Open science, source, data.

🇨🇭 Switzerland
Joined May 2015
Most improvements aren't meaningful. How to spot the ones that are? 🤔
Now that "Do Transformer Modifications Transfer Across Implementations and Applications?" has been accepted to #EMNLP2021, we can finally tweet about it! Paper 📝: arxiv.org/abs/2102.11972 Code 💾: github.com/google-research/g… Thread summary: ⬇️ (1/8)
The easiest notation is the most familiar one. That said, code is more formal than math. Everything is precisely defined, no ambiguity. A machine must run it!
I am really surprised by how revelatory this seems to many of the people in the thread... Isn't pseudocode notation as complicated conceptually as sum/product? Or is it an audience of programmers who all had terrible math teachers?
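For instance, the same inner product in sum notation and as runnable code (a toy example of my own, not from the thread):

```python
# Math:  y = \sum_{i=1}^{n} w_i x_i   (an inner product)
def inner_product(w, x):
    """The sum spelled out as a loop a machine can actually run."""
    assert len(w) == len(x), "both sequences must have the same length"
    y = 0.0
    for w_i, x_i in zip(w, x):
        y += w_i * x_i
    return y

print(inner_product([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```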
When data is structured by an uncertain domain, one can consider a distribution over graphs: a multi-layer graph whose layers are (weighted) realizations of the structure. Could also be learned. arxiv.org/abs/2108.09192 by Feng Ji, Wee Peng Tay, @elpenta.
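A minimal sketch of that idea (hypothetical names, not the paper's code): given an edge-probability matrix, sample a few weighted realizations and stack them as the layers of a multilayer graph.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_layers = 5, 3

# Hypothetical model of the uncertain domain: independent edge probabilities.
P = np.triu(rng.uniform(0, 1, (n, n)), k=1)

def sample_layer():
    """One (weighted) realization of the uncertain structure."""
    A = np.triu((rng.uniform(0, 1, (n, n)) < P) * rng.uniform(0.5, 1.5, (n, n)), k=1)
    return A + A.T  # undirected: symmetric adjacency matrix

# Stack realizations as the layers of a multilayer graph.
layers = np.stack([sample_layer() for _ in range(n_layers)])
print(layers.shape)  # (n_layers, n, n)
```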
Graph theory sounds cool!
Replying to @SC_Griffith
Graph theorists have a million different cutesy names for various types of graphs. "It's easily proved that a constellation that doesn't have any wombats is either a bipartite fishing pole or an acyclic hyacinth" No idea wtf they're on about
Even researching it sounds fun. 😅
a graph theorist told my prof that this is the secret of graph theory publishing: go to conferences & hang out with some friends and make up a kind of graph. call them lobster graphs or something who cares. then everyone goes home and proves a bunch of things about lobster graphs
Hypergraphs are "filling the middle of a kind of mathematical sandwich, bound on top by these ideas from topology [simplicial complexes], and underneath by the limitations of graphs."
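A tiny way to see that "sandwich" (my own illustration, not from the quoted article): a hyperedge on three nodes carries more than pairwise graph edges, while a simplicial complex would additionally force all of its faces to exist.

```python
from itertools import combinations

# One hyperedge on three nodes: more than a graph edge (which joins only pairs)...
hyperedge = frozenset({"a", "b", "c"})

# ...but less than a 2-simplex, which would require every face to be present too.
simplicial_closure = {frozenset(f)
                      for k in range(1, len(hyperedge) + 1)
                      for f in combinations(sorted(hyperedge), k)}

print(hyperedge)            # the hypergraph keeps just this one set
print(simplicial_closure)   # the simplicial complex needs all 7 faces: {a},{b},{c},{a,b},{a,c},{b,c},{a,b,c}
```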
to compete, from the Latin competere "strive together"
Protein imaging example [arxiv.org/abs/2104.06237]: translation invariance is built-in by convolution while noise invariance is brute-force learned by augmentation. Constraining the functional space is more compute/memory/data efficient. But knowing the invariants isn't enough:
Data augmentation is one of the ugliest hacks in ML. If you know what the invariances are, encode them into the architecture. Don't blow up the size of your dataset in order to approximate them.
We must know how to build them in. While we know how to build in spatial invariants (symmetries)—largely because physicists have been studying them for centuries—we don't know how to build in invariance to noise or illumination.
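A toy contrast of the two approaches above (a sketch of my own, not the paper's model): translation invariance built into the architecture by circular convolution plus global pooling, versus approximated by augmenting with random shifts.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=32)   # a toy 1D "image"
w = rng.normal(size=5)    # a convolution filter

def conv_pool(x, w):
    """Built-in invariance: circular convolution + global max pooling.
    The output is identical for every circular shift of x."""
    responses = np.array([np.dot(np.roll(x, -i)[:len(w)], w) for i in range(len(x))])
    return responses.max()

def augmented(x, n_shifts=8):
    """Brute-force alternative: blow up the dataset with shifted copies and hope
    the model learns the invariance from the extra data."""
    return [np.roll(x, rng.integers(len(x))) for _ in range(n_shifts)]

print(np.isclose(conv_pool(x, w), conv_pool(np.roll(x, 7), w)))  # True: invariant by construction
print(len(augmented(x)))  # 8 extra samples, and only an approximate invariance
```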
Open-source core scientific infrastructure is paramount to science, and that must be better recognized! A nice little step in that direction. #MarsHelicopter #Ingenuity
BTW, these are no major contributions, just fixes to issues as I see them. That's the strength of open source, facilitated by @github: lowering the bar to contributing. Become a prosumer!
Where do you check a researcher's profile (in ML and CS)?
96% Google Scholar
1% Semantic Scholar
0% Microsoft Academic
2% Other(s)
85 votes • Final results
That's WL and MP on a multilayer graph whose vertices are the simplices and whose sets of edges encode the face/coface/adj↓/adj↑ relationships.
"Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks" has been accepted as a spotlight paper to the GTRL Workshop at #ICLR2021. Work w/ @ffabffrasca @wangyg85 @kneppkatt G. Montufar, @pl219_Cambridge @mmbronstein
They show that such MP on the clique complex is more expressive than MP on the graph. Since the clique complex adds no information, the gain must come from a better exchange of messages: graph MP only exchanges between 0-simplices through adj↑ relationships. MP is limiting!
The message-passing view of graph NNs, now embedded in researchers' minds and frameworks, is limiting. It emphasizes the local connectivity, while we care about the global (or neighborhood) structure that emerges from that local connectivity.
Michaël Defferrard retweeted
Applications are open for the London Geometry and Machine Learning Summer School! Join LOGML 2021, where early-career researchers will work in small groups with experienced mentors on a research project at the intersection of geometry and ML. logml.ai
While there's a simplicial 🐟 in there, that's no April Fools' joke! (Yes, I'm 🇨🇭, but the 🇫🇷-speaking kind.)