Hacker News

8 hours ago by heisig

Speaking of Common Lisp - the European Lisp Symposium starts tomorrow (May 3 and May 4, https://european-lisp-symposium.org/2021/index.html). The entire conference will be broadcast on Twitch. Python programmers are invited, too :)

5 hours ago by ollran

Thanks for sharing this. After looking at the article, I find call-site optimization for Common Lisp quite interesting.

http://metamodular.com/SICL/call-site-optimization.pdf

4 hours ago by jes

This is exciting. Thanks for the reference.

3 hours ago by agumonkey

Not to start a flamewar, but every time I see a Python talk (PyCon or otherwise) with fancy tricks like metaclasses, all I can think is that, well, CLOS would have been a perfect fit for this too.

I know people are tired of the "Lisp/Smalltalk did it better" refrain, but what features of Python are not possible (or hard) in CL[OS]?

ps: how many CL shops are out there? I'd work nearly for free just to try a CL team once.

2 hours ago by dasyatidprime

CLOS (as you presumably know) models method application as calls to generic functions, instead of the now-more-mainstream Smalltalk-like message-dispatch approach which Python uses. The latter allows for things like overriding __getattr__ to intercept ‘all’ method calls and property accesses, for which I don't think there's any equivalent in CLOS.
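
To make that interception concrete, here is a minimal Python sketch (the class and names are purely illustrative): because __getattr__ only fires when normal lookup fails, a thin wrapper can observe essentially every method call and attribute access on the object it wraps.

    class LoggingProxy:
        """Wrap an object and log every attribute or method access."""
        def __init__(self, wrapped):
            self._wrapped = wrapped

        def __getattr__(self, name):
            # Only called when normal lookup fails, i.e. for every
            # attribute that actually lives on the wrapped object.
            print(f"accessing {name!r}")
            return getattr(self._wrapped, name)

    proxy = LoggingProxy([1, 2, 3])
    proxy.append(4)  # prints: accessing 'append'

(Dunder methods invoked by the interpreter itself bypass __getattr__, which is why ‘all’ deserves the scare quotes.)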

The way methods are ‘attached’ to classes gives you a natural form of type-directed name lookup. CL generics have the advantage that you can define your own methods on existing classes while naming the methods in your own package so they don't conflict, but also have a curse of inconvenience along the way where importing a class doesn't naturally pull in everything associated with it, and you wind up writing the class name again and again when dealing with fields of an object. with-slots et al. are poor substitutes. (In an experimental sublanguage at one point I actually had local variables with object type declarations implicitly look up the class using the MOP and symbol-macrolet every available var.slot combination within the scope as a brute hack around the most common desirable case.)

Python's short infix/prefix operators are naturally generic, since they're implemented as method calls. In CL there's the generic-cl extension, but I haven't seen it have that much uptake… in particular, any library code that isn't explicitly aware of it won't use it ‘naturally’ on foreign objects, which could be good or bad.
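
For the Python side of that point, a tiny sketch of how an infix operator dispatches to an ordinary method (Money is a made-up toy type):

    class Money:
        """Toy value type: '+' works because Python maps it to __add__."""
        def __init__(self, cents):
            self.cents = cents

        def __add__(self, other):
            return Money(self.cents + other.cents)

        def __repr__(self):
            return f"Money({self.cents})"

    print(Money(150) + Money(250))  # Money(400)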

That shades into the very-concrete type system that CL starts out with, where any attempt at ad-hoc polymorphic interop is a disaster unless everyone already agrees on what methods to use. I can't make a thing that acts like a hash table but uses a different implementation underneath, then pass it to something that expects to be able to gethash on it. I especially seem to get bitten by this in cases where alists are the expected way of representing key-value maps: there's no way to extricate yourself from the linear search without rewriting every piece of code that touches it, there's often an implicit contract that you don't want duplicate keys but it's easy to violate by accident and create bad behavior down the line, and so on. By comparison, Java collections in particular got this very right in terms of decoupling intention from implementation, and Python does basically the same thing but with a looser set of ‘expected’ methods.
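
To illustrate the looser Python contract (a sketch with made-up names, not anything from the article): subclassing collections.abc.MutableMapping only requires five core methods, and the result can be passed to any code that expects a dict-like object, regardless of the storage underneath.

    from collections.abc import MutableMapping

    class CaseInsensitiveDict(MutableMapping):
        """Acts like a dict wherever a mapping is expected."""
        def __init__(self):
            self._data = {}

        def __getitem__(self, key):
            return self._data[key.lower()]

        def __setitem__(self, key, value):
            self._data[key.lower()] = value

        def __delitem__(self, key):
            del self._data[key.lower()]

        def __iter__(self):
            return iter(self._data)

        def __len__(self):
            return len(self._data)

    d = CaseInsensitiveDict()
    d["Host"] = "example.org"
    print(d["host"])  # example.org
    print(dict(d))    # plain-dict conversion comes for free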

By default, Python objects have a ‘purely’ dynamic set of properties, rather than the fixed slots CLOS imputes on an object via its class. Indeed the class-level property one can set in Python to constrain this for possible performance gains is called __slots__.
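
A quick illustration of that constraint (Point is just a hypothetical example class):

    class Point:
        __slots__ = ("x", "y")  # fixed attribute set, no per-instance __dict__

        def __init__(self, x, y):
            self.x = x
            self.y = y

    p = Point(1, 2)
    p.z = 3  # AttributeError: 'Point' object has no attribute 'z'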

2 hours ago by agumonkey

Thanks for your points; it's true that Python's dynamic operator genericity is very handy.

an hour ago by kbutler

Beginner approachability seems to be the key feature. Possibly also integration with external libraries.

41 minutes ago by agumonkey

That's my take on it, but I've run into people praising it like it was an alien theory of everything dropped by the gods as a gift.

I like Python, but I'm a bit fed up with the mobthink (how surprising).

3 hours ago by mark_l_watson

Nice article.

One thing, re “In Python we typically restart everything at each code change”: I sometimes run Python in Emacs with a REPL and evaluate the region to pick up edits. Not bad.

The big win for the Common Lisp REPL is being able to modify data, do restarts, etc. I usually use Common Lisp, but right now I am heavily using Clojure to write examples for a new Clojure AI book. I miss the Common Lisp REPL!

7 hours ago by PeterStuer

I like Lisp, and I'm not a fan of e.g. Python's whitespace sensitivity. That said, for niches such as ML and data science, I find you just can't beat the Python ecosystem.

3 hours ago by chalst

> for niches such as ML and data science, I find you just can't beat the Python ecosystem

There definitely are many areas where Python is best, but for ML and data science, Julia (i) is very competitive in library coverage, (ii) is more performant and flexible, and (iii) has a very good Python bridge if it's needed.

I can imagine there are niches within ML and data science where what you need are Python-only libraries, where you don't miss anything by restricting yourself to the numpy type hierarchy, and where there's no advantage to calling the libraries from Julia, but I'm curious whether that is what you actually meant and, if so, what you are doing.

21 minutes ago by SatvikBeri

After using Python for data science for 3 years (since founding the current startup), I mostly switched to Julia about 4 months ago. So far the only Python libraries I've really missed are boto3 and sqlalchemy for SQL generation – both of which can easily be called in Julia using PyCall.

I think people often underestimate just how much faster Julia is than numpy; I've consistently seen performance improvements on the order of 10x-30x when porting code.

7 hours ago by rhlap

Are people actually doing science with Python or are they talking about doing science?

There's so much buggy, low-quality stuff in that space that I'd write a serious application in C or C++ from scratch.

It would be a custom application, sure, but not everything needs to be general.

Also, I find Lisp much more natural for mathematical reasoning.

7 hours ago by gustavo-fring

Yeah, they're very much doing it.

Pandas is huge, and libraries like Spacy, NetworkX, etc. exist. It's a massive and good ecosystem. I'd hazard a guess that Python is the go-to for scientific computing in most of the sciences for newer students, over the older R and Julia.

This will be blindingly obvious if you work in that area. Yes, you can do it in another language, but you're missing out on a lot of stuff that is just done, is state of the art, and is fast because the speedy parts aren't in Python. The complaints about parens for Lisp are superficial, and in my experience the same goes for whitespace in Python. They just don't matter.

6 hours ago by tluyben2

> and is fast because the speedy parts aren't in Python.

Having worked for months with a slew of senior data scientists, I found this a bit painful. Python is so slow, and while those data scientists were very good at coming up with solutions to the company's problems, the implementations (using Spacy, Pandas, and other libs) had enough Python in them to make them impractical for the company's use case. They were nice prototypes which I then had to fix or even rewrite in C/C++ (we tried Rust as well) to make them usable in the company data pipeline.

I think companies are burning millions (billions in total?) on depressingly slow solutions in this space, throwing massive compute at them just to make them complete their computations before the sun dies out.

Example: we needed a specific keyword extraction algorithm for multiple languages; my colleague used Spacy and Python to create it. It took a couple of seconds per page of text; we needed at most a few ms on modern hardware. He spent quite a lot of time rewriting and changing it, but never got it under 1s per page on xlarge AWS instances. My version takes a few ms on average, executing the same algorithm but in optimised C/C++.

Sure we could've spun up a lot more instances, but my rewrite was far cheaper than that, even in the first month.

7 hours ago by beforeolives

> Are people actually doing science with Python or are they talking about doing science?

People are actually doing it. And a lot of it too. Both in terms of data science (as a broad term that can mean a bunch of different things) and in terms of computation for specific scientific fields like physics or biology.

5 hours ago by 7thaccount

Very much actively doing so on my end; nearly all work in the industry is in Python, with some Matlab, C, C++, and Julia sprinkled in.

Python is a great high-level language for basically everything but hardcore low-latency apps. I can parse text, connect to databases, do sparse matrix computations on massive matrices, calculate network flows, generate large node-graph diagrams, use a Python-based API to connect to any vendor software I've seen, and do any kind of statistical analysis I need with pandas; an amazing, free IDE gives me a REPL, code editor, and data structure viewer with ease, and there are Python notebooks for education, and so on. I've frequently found that I can rewrite a vendor's 10k-line C++ program in a few pages of Python, as the built-in Python data structures make text parsing extremely flexible and simple.

5 hours ago by enriquto

> do sparse matrix computations on massive matrices

This is completely impossible to do in the Python language, unless you resort to external tooling written in C or Fortran. Sure, you can call these codes from Python, as you can call them from any other language.
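
For concreteness, this is roughly what "calling those codes from Python" looks like in practice, e.g. via scipy.sparse (the sizes and values here are arbitrary); the numerical kernels doing the work are compiled code, not Python:

    import numpy as np
    from scipy import sparse

    # Build a huge, mostly-empty matrix in compressed sparse row form.
    rows = np.array([0, 1, 2])
    cols = np.array([1, 2, 0])
    vals = np.array([3.0, 4.0, 5.0])
    A = sparse.csr_matrix((vals, (rows, cols)), shape=(1_000_000, 1_000_000))

    x = np.ones(1_000_000)
    y = A @ x  # sparse matrix-vector product, dispatched to compiled code
    print(y[:3])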

3 hours ago by golergka

Yes. Python is pretty much the main tool in biology, for example. C or C++ would be abysmal for similar exploratory scientific tasks.

3 hours ago by whalesalad

I use hot reloading with an IPython REPL. I write my code in such a way that I can interact with any individual part of the system via a REPL. Lisp excels here, but you can have a decent approximation of a real-time evaluation loop going.
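
A rough sketch of that loop, assuming a workflow built on IPython's autoreload extension (the module and function names below are placeholders, not real libraries):

    # In an IPython session:
    %load_ext autoreload
    %autoreload 2      # reload edited modules automatically before each statement

    import mypipeline  # hypothetical module under active development
    result = mypipeline.process_batch("sample input")
    # Edit mypipeline.py in your editor, rerun the call above,
    # and the new definition is picked up without restarting.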

2 hours ago by wexq

I'm not a particularly experienced or good Lisp programmer; I can customize Emacs, but that's pretty much it.

However, I think this article is a bit skewed and doesn't highlight the things Python has.

For instance, the standard library means that Python is more usable out of the box.

Also, with things like IPython or Jupyter you can get off the ground easily.

So, in the end, they're two different languages, and I do not think either one is better than the other. Right tool for the job and all that.

2 hours ago by mplanchard

This point seems to be addressed in the “State of the Libraries” section[0]

[0]: https://lisp-journey.gitlab.io/pythonvslisp/#state-of-the-li...

8 hours ago by gustavo-fring

This might seem slightly unrelated, but I was reading Elixir in Action and one of the statements is along the lines of a debugger being difficult to use in its naturally concurrent environment. The Elixir strategy is to kill erroring processes, capturing their exit signals with supervisor processes and then possibly recreating a replacement process.

Can the common lisp condition system be adapted to Elixir? Is there an advantage to doing so? Is there some obvious tradeoff between the two I'm not expressing?

Thanks.

See this thread from HN for more about adapting the condition system elsewhere.

https://news.ycombinator.com/item?id=26852309

7 hours ago by alpaca128

Elixir is based on the BEAM virtual machine developed for Erlang. Restarting crashed (very lightweight) processes to handle errors is the normal way of doing things in that system.

LFE (Lisp-flavored Erlang) is an existing language that combines Lisp syntax with Erlang's backend, though I haven't used it myself yet.

4 hours ago by prionassembly

Obligatory reference to Hy / Hylang, a Lisp that compiles to the Python AST.
