
A framework is a weasel word for a Massively Coupled System. The trappings of orthogonality and modularity are given lip service, while actually creating some of the most anti-modular footprints in human systems design history.


This has all happened before. MFC used to be the way to structure your Win32 applications. Later on, Microsoft leaked a small library known as WTL that provided a lot of the UI niceties of MFC without a gigantic runtime DLL. More importantly, it didn't prescribe as much of an architecture. It became very popular; I'd attribute a big part of that to it feeling non-monolithic.

The biggest disservice that Industry does to working-class programmers is telling them that all of these 'old' practices of modularity and decoupling are outdated, can't possibly work, are too hard to learn, too academic, or require writing too much code. Those practices actually free developers to work faster and better; abandoning them shackles developers to fashionable technology and keeps them in a perpetual state of engineering amateurism.

Worse, Industry has the gall to proclaim each small step as progress. It's all hype and bullshit, including your favorite framework.


I generally agree. But I think there exist good "frameworks" out there. I'd cite Bottle[1] as one. And while it's still new, I think Martini[2] has the beginnings of something great. Both of them heavily rely on principles of composition.

[1] - http://bottlepy.org/

[2] - https://github.com/codegangsta/martini


I agree. I think there is a movement afoot (if not yet named) toward new frameworks that are truly decoupled, designed by war-worn veterans. The first frameworks were all massively coupled bells and whistles. The whole point of RoR is a pseudo-OO skin over a massively coupled Active Record system that can't really be extended without hula hoops. And Rails gurus love hula hoops.


One problem: users of open source love oodles of features, no matter how useless. Witness the sheer number of people who admit the primary way they choose a library is the last commit date and features list.

So if you write a solid library with just the right number of features, then it might get less consideration because you're not always duct-taping over poor design choices.

This is what happens when we glorify Internet Time. It becomes more about shipping and less about quality.


This is very true. It is a real issue in the open source world. http://github.com/codegangsta/martini prides itself on minimalism. Thankfully the modularity of the project allows me to tell people to add features via other packages and repositories. But even though the product is solid, it is difficult to communicate that the project is still active without a steady stream of commits.

This is one of the reasons that I find the Golang package management philosophy refreshing, at least in theory. "Master should never break" really prevents feature creep from coming in and promotes the use of solid packages that aren't always being actively worked on. Of course there are some major drawbacks with respect to the lack of versioning in Go, but I think the philosophy there is overall a very good thing for open source development.


Incredibly, people seem receptive to overly ambitious, feature-creep-laden libraries even when they're completely half-baked. It's like they'd rather debug someone else's code than write it properly in the first place. IMO, there are few things more painful than when someone's library just doesn't work at all. The 'shiny' factor of a community usually indicates a lack of respect for good engineering. I much prefer communities that take coupling seriously; the only one I've found so far is Clojure's.

Please continue to push Golang away from fashion-oriented 'engineering.' I hope you take marketing seriously; it seems very possible for someone to create the next ultra-coupled-hack-of-a-Go-framework to rile everyone up and consequently forget all the lessons of minimalism.


Well, there's already Revel for that :)


I don't use the last commit date itself; I use how many issues, pull requests, and mailing list posts there are, how recent, whether the library author is replying, and how many others are using the library in their projects. For any reasonably sized library, there should at least be some of this happening. I don't want to take on the full development effort of the library by myself if I run into problems with it.


I disagree. I tried learning Catalyst for Perl a few years back. You could swap out almost any component in it, and the tutorial explained how to. Being new to the whole concept of a web framework, I couldn't actually work out what the framework was doing, as it just seemed like a collection of libraries for various tasks. Learning Django after that, it all made a lot more sense. You can swap parts out, but it makes more sense, especially in the beginning, to use the sensible defaults provided. They are probably more coupled than they could be, but it works well, and the parts play nicely together.

JavaScript frameworks seem to be another matter.


Let's review what we need for most web applications, at minimum.

-- Some way to parse URLs coming into the system and route to the correct code that handles the response to that request.

-- Some way to marshal and unmarshal form, request parameters, and more recently JSON.

-- Some way to return HTML, after we have done what we need to considering the inputs we received.

-- Some way to handle cookies. We probably also want a convenience layer so that we can have some sort of session to make authentication and authorization easier.

-- Some way to interface with a more fixed storage, usually some form of a database. It would be nice to have a set of convenience methods to handle prepared statements.

-- It would be nice to have some sort of way to escape text going to and from the fixed storage to help prevent XSS attacks.

-- It would be nice to have a way to conveniently handle CSRF attacks.

Now, we could have a library for each one, or maybe 20. But that means for every new project we're choosing 7 (at a minimum) libraries, evaluating them for security, keeping on top of security updates for 7 projects, and learning 7 (or more) fundamental libraries each time we join another team, because someone made different choices. On top of that, we've written glue code over all of this to make our lives livable. The current state of code ownership means we probably can't take that glue from employer to employer, so we'll have to write it all over again, or learn someone else's glue code with its own idiosyncrasies.

All for things that matter quite a bit, but I'd prefer a single good implementation over having to search for the 7 best. Further, what happens when one of these projects goes dormant? It's easy to rip everything out if you've written your code modularly, but what about that junior programmer's code from before you were there?

And are we supposed to thrust this all on a junior programmer who is just starting out? That's how PHP happened. ;) Not every programmer is gifted with a good sense of architecture, and most frameworks at least enforce a Not-Terrible architecture. When taking over someone else's code, this can be a very good thing.

Frameworks, as massively coupled as they are, have some distinct advantages for getting things done in 95% of systems. If you're running a system for which the defaults don't work, a framework isn't right for you. That's fine. Depending on how far your needs are from the framework's opinions, you can either cobble together your own solution or write it from scratch. But for most use cases, there are more advantages to that massively coupled system than disadvantages.



