Battle Page Bloat

add:
http://digiday.com/sponsored/5-ways-publishers-tech-choices-come-back-haunt/

And all things that break basic web functionality

The botched mobile web experience isn't the fault of the phone browser; it's the fault of the web developer. Developers need to stop being lazy.


HN comment:

Am I in the minority that I don't want an app for every. bloody. webpage. I visit?

Another HN comment:

I also don't want every mobile webpage I visit to use some slow janky JavaScript framework to emulate native app behavior either because in my experience the user experience for those are universally worse than just trying to be a relatively normal web page (perhaps with some media queries for image sizes, etc) and letting the mobile web browser do its thing.


Some people believe that Facebook's Instant Articles will further hurt the web. Maybe, but the web began getting damaged years ago with bloated web designs that created slow, clunky, horrible user experiences.

Media sites would have functioned better with a plain, vanilla 1995 look with bare bones HTML. Add a smattering of CSS with media queries, and it would be possible to create small, lightweight, fast-loading web pages in 2015 that are still comfortable to read.
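A minimal sketch of that idea (all of the markup and styling here is illustrative, not any site's actual page):

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <!-- the viewport tag makes the page read well on phones -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>Article title</title>
      <style>
        body { max-width: 38em; margin: 0 auto; padding: 0 1em;
               font: 1.1em/1.5 Georgia, serif; }
        img  { max-width: 100%; height: auto; }
        /* the smattering of media queries: tweak spacing, not readability */
        @media (max-width: 480px) { body { padding: 0 0.75em; } }
      </style>
    </head>
    <body>
      <h1>Article title</h1>
      <p>Article text ...</p>
    </body>
    </html>

No JavaScript, one small embedded stylesheet, nothing render-blocking. A page like this weighs a few kilobytes and loads nearly instantly on any device.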

I like client-side JavaScript for logged-in dashboard functions like what's used with my Digital Ocean account. Their usage of JavaScript is elegant. It's not overdone. They don't use JavaScript just because they can. The JavaScript usage serves a purpose and makes administration of my droplet pleasant.

Ditto for my Fastmail.fm account, which is my favorite email service.

For content sites where I don't log into the site, I don't understand the misuse and overuse of client-side JavaScript. That's why I surf the web with JavaScript and other things disabled by default, thanks to the NoScript plugin for Firefox.

I also don't care for the overuse of images on websites' homepages and irrelevant images on article pages. And huge images are unnecessary most of the time, since most readers are accessing the sites on phones.

Over the past few years, media sites have adopted responsive design in order to have one website that functions well on all devices. This is good. But unfortunately, many of these sites bog down older computers with code bloat. Single web pages are slow to load and weigh multiple megabytes. Absurd.

And I don't understand why many responsively-designed websites display such a tiny font size on phones. It's an uncomfortable reading experience. What numskull decided that responsive design requires small font sizes on phones?

My prefs for media homepages and article pages:

No JavaScript.
Minimal CSS if possible. This is hard to do.
Small images most of the time.
Useful images that are related to the article.
Homepage with a stream or feed view.
Don't break the back button, and don't make clicking it behave abnormally, like taking me back to the top of the site instead of where I left off.
Don't break right-click or the open-in-a-new-tab function. This occurs way too often, and it's infuriating. Repulsive.
Don't break highlighting and copying text for excerpting.

Since 2012 or 2013, the web experience has grown increasingly frustrating, thanks to bloated, obnoxious designs caused by the misuse of JavaScript. It seems like website owners have intentionally created miserable web experiences across all screen sizes in order to convince people to use their native mobile apps.

I'm sure that the bad user experiences on newly-designed websites are unintentional. I'm guessing that many site owners, designers, and developers deploy what's currently popular because it uses cool technology, even though the tech does not always improve the reading experience.

"The user experience sucks but that animation is cool." - pointy-haired media boss

I definitely prefer a browser-based open web, including on mobile, but when obnoxious web experiences are created, that opens the door for Facebook to innovate something new that improves the user experience.

I don't blame Facebook. I blame other site owners for building bloated, clunky websites over the past few years.

"the open web isn't well-suited for mobile devices"

"The mobile web is tough to navigate."

True for poorly-designed websites. Bloated and clunky with unnecessary JavaScript.

For pleasantly designed websites that are kept simple, the web works well on mobile, and the mobile web is easy to navigate.

Simplify and make the experience comfortable. Focus on the individual article page. Strip it down to the bare parts or skeleton:

article header:
title
maybe the author
maybe the created and updated dates
article body:
obviously, the content.
article footer:
maybe the author
maybe the created and updated dates
maybe a way to contact the author
maybe a way to share the article
Nothing else is needed on the page, except for a "Home" link, located in the upper left or upper right corner of the page.
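In bare HTML, that skeleton might look something like this (the names and dates are placeholders):

    <a href="/">Home</a>

    <article>
      <header>
        <h1>Article title</h1>
        <p>By Jane Author &middot;
           <time datetime="2016-03-06">March 6, 2016</time></p>
      </header>

      <p>Obviously, the content.</p>

      <footer>
        <p><a href="mailto:jane@example.com">Contact the author</a></p>
      </footer>
    </article>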

Use little to nothing in the web site footer area of the article page.

Save all of the website header, navigation, and footer info and links for the home page. Why repeat all the crap on the article pages?

Keep the article pages as lightweight and simple as possible. Let the negative space shine.

Embedded images and videos that are part of the article body will load more slowly than the text, but at least the text displays quickly, especially if the article page is pulled from cache.

But media sites add dozens of trackers and other "things" to a single article page. It's offensive.


Google's AMP project may be proof that websites have been overly engineered to a massive fault for too many years. And it may be proof that website owners have been more concerned with displaying new tech skills, instead of exhibiting empathy for the readers.

Site owners should create a humane web experience.

The delicious irony here. The horribly bloated and slow-loading Verge.com published a story about Google speeding up the mobile web.

Today, the company announced a technical preview for a system called Accelerated Mobile Pages (or AMP), designed to fight many of the factors slowing and bloating mobile web pages.

Google is trying to fix the problems created by website designers, developers, and probably the ever-present "stakeholders."

If the system works, users should see lighter, faster-loading mobile web pages as a result.

Website owners can do this on their own by limiting or eliminating JavaScript exposure for browsing-only users and by reducing the number of giant images that get downloaded. With some effort and common sense, you don't need help from Google to make websites load quickly on the mobile web.
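For the giant-image half of that, plain HTML is already enough. A sketch, with hypothetical filenames:

    <img src="photo-640.jpg"
         srcset="photo-320.jpg 320w, photo-640.jpg 640w, photo-1280.jpg 1280w"
         sizes="(max-width: 40em) 100vw, 40em"
         alt="An image that actually relates to the article">

The browser fetches the smallest file that fits the screen, so phone readers stop downloading desktop-sized images.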

http://babyutoledo.com :

I created babyutoledo.com for a local non-profit group to be simple and lightweight. The focus is on the content and the org's purpose. I wanted the reading experience to be comfortable for the users on any device.

I'm sure that I can do more to streamline the pages and CSS to speed up page download speed.

Some of my niche blog sites also use memcached.

Keep it simple and lightweight, and the pages load well on mobile.

I created this static page with no JavaScript as an example of what I would like to see produced by the Toledo Blade. I would pay a hefty digital subscription if I could get pages like this.

http://toledotalk.com/last-alarm.html

It's interesting that website owners may be thrilled to receive a solution to a problem that they created.

If you are smashing your left hand with a hammer that you are holding in your right hand, you should not be thrilled to receive some padding to protect your left hand while you continue to hammer away at it. Obviously, the smart thing to do would be to drop the hammer. And don't blame the hammer manufacturer for your ailing left hand.

Don't misuse technology to create a problem. Don't whine about the technology. And don't hope that someone creates new technology to solve the problem that you created.

Thanks to Google AMP, does this mean that website owners, designers, and developers can continue to create poor user experiences because they can rely on other services to do the work that they refuse to do?

Brilliant analysis.

https://mobile.twitter.com/om/status/651759921990647808

Publishers embracing Facebook, Google & Apple platforms are merely shining light on their own technical incompetence


The Web or the web is not slow. Browsers on desktops, laptops, tablets, and phones are not the problem.

Bloated, poorly designed websites are the problem. And publishers create them. These stunningly huge article pages bog down old computers and cause slow downloads.

The solution is to create simpler websites.

My test site, which uses my new formatting app to create static HTML pages, loads fast and works fine on desktop and phone.

My other sites also load fast because the Nginx web server pulls the homepages and the article pages from Memcached. If an article page is not cached, then my code is accessed; it pulls the content from the MySQL or CouchDB database, and the content is cached before or after it's sent to the user. A refresh then retrieves the newly cached page. And when I create or update an article, that page is cached right away. Other views, such as page 2 and beyond of the stream and search results, are not cached.
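The Nginx side of that arrangement looks roughly like the following (the key scheme, ports, and paths are placeholders, not my exact config):

    # try memcached first; on a miss, fall back to the app,
    # which renders the page from the database and caches it
    location / {
        set            $memcached_key "$uri";
        memcached_pass 127.0.0.1:11211;
        default_type   text/html;
        error_page     404 502 504 = @app;
    }

    location @app {
        proxy_pass http://127.0.0.1:8080;
    }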

Make the web faster? No. Make your websites leaner. Make your websites reader-friendly instead of reader-hostile.

Memcached and Redis are simple to implement and use. Does every article page need to contain dynamic elements? The answer is probably 'Yes', especially for all the tracking, advertising, and analytics crap that users must download.

http://www.wsj.com/articles/google-starts-including-amp-content-in-mobile-search-results-1456326159

“We want to make it really easy for publishers of all shapes and sizes to publish AMP-formatted pages, from the New York Post all the way down to people running their own personal blogs,” said Paul Maiorana, vice president of platform services at WordPress.com parent company Automattic.

Why not create simple, fast-loading, reader-friendly pages by default?

Under the hood, AMP works by simplifying and streamlining the HTML code that powers Web pages to prioritize speed. Google also “caches” pages, or saves copies of them on its own systems, in order to deliver them quicker when users access them. It’s an open-source initiative, meaning anyone is free to use it.

Again, content producers on their own can create simple HTML pages, and they can cache their own content for faster access, thus creating a reader-friendly experience.

http://www.niemanlab.org/2016/02/diving-all-in-or-dipping-a-toe-how-publishers-are-approaching-googles-accelerated-mobile-pages-initiative/

“Mobile web performance is bad — I challenge you to find someone who disagrees with that,” Mic’s chief strategy officer Cory Haik told me.

I would vehemently disagree. The mobile web performance is not bad. Your obnoxiously bloated and clunky website is bad. You create a reader-hostile experience and then blame something else.

“When our pages load too slowly on mobile, as a publisher, we’re losing an audience, and that is painful. So we’ve been excited to build on AMP.”

Cuckoo time.

Somebody gets it right.

https://backchannel.com/google-is-going-to-speed-up-the-web-is-this-good-a92a6043598b#.btthbzw7l

“Sluggish” is too tame a word for what we endure now, due to an accumulation of terrible news-industry design and business choices in recent years.

Before getting into details about what’s happening here, let’s be clear on something. AMP wouldn’t be necessary — assuming it is in the first place — if the news industry hadn’t so thoroughly poisoned its own nest.

Looking for money in a business that grows more financially troubled by the month, media companies have infested articles with garbage code, much of it on behalf of advertising/surveillance companies, to the extent that readers have quite reasonably rebelled.

We don’t like slow-responding sites, period.

On our mobile devices, which are taking over as the way we “consume” information, we despise having to download megabytes of crapware just to read something, because the carriers charge us for the privilege.

That’s one reason why we use ad blockers. (The other, at least for me, is that we despise being spied on so relentlessly.)

The news business could have solved this problem without racing into the arms of giant, centralized tech companies. But it didn’t, and here we are.

What if news sites had just done the right thing in the first place? Or, since they didn’t, what if they just resolved to build faster pages — using standard HTML markup and loading components in a non-annoying way — now?

Empathy for readers means not burdening the readers with massive article pages that are bloated with unnecessarily huge and maybe irrelevant images, along with a ton of javascript, trackers, etc.

Simplifying leads to faster performance and a better reading experience. Speed is only one aspect of this.


more thoughts about Google's AMP project:

Hilarious. Publishers need to rely on other companies to serve fast web pages on mobile. That's because publishers create some of the worst designed and most bloated websites on the internet.

Here's how to speed up an article page:

I use memcached to cache article pages and page one of the home page stream. That means that a user will interact only with Nginx and memcached.

If the page is not cached, or the user hits page 2 or higher of the home page stream, or the user conducts a search, then my apps pull the content from a database, such as MySQL or CouchDB.

If it's an article page that is not cached, then my code knows that and places the article into cache. A refresh would get the article from memcached.

When I'm logged into one of my web publishing apps, then each article page is dynamically created by being retrieved from the database.

And when I'm logged in, I access JavaScript only when I use the browser-based JavaScript editor, which is what I'm using right now to create this post in my Junco app.

I love, actually thoroughly enjoy, writing on the web using this JavaScript editor that I "created" by downloading someone else's code and greatly modifying it.

The editor that I downloaded was a Textile live preview split-screen editor. I disliked the distracting live preview. Plus, my apps support Markdown/MultiMarkdown and my own custom formatting commands, and I didn't want all of that smarts in the client-side JavaScript. I have no problem with using the server to render content. It only takes about one second. What's the issue with server-side rendering?

I modified the JavaScript editor to support an optional single-screen mode. I added buttons to the top to support these options. I added keyboard shortcut commands for previewing, saving, and switching screen modes.

I added a true full-screen mode. I added a narrow, five-line screen mode.

I added the ability to change from dark on light to light on dark, regarding text color and background color.

I added autosave to the editor. By default, the editor saves the markup every five minutes. I can change that interval by using the autosave= command and specifying the seconds. Autosave kicks in only when an addition or deletion has been made. If there are no changes, then there's no autosave.
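The autosave behavior boils down to a dirty flag and a timer. A simplified sketch, with the textarea and save function standing in for my actual editor code:

    var dirty = false;
    var autosaveSeconds = 300;     // the default: every five minutes

    textarea.addEventListener('input', function () {
      dirty = true;                // an addition or deletion was made
    });

    setInterval(function () {
      if (!dirty) { return; }      // no changes, no autosave
      dirty = false;
      saveMarkup(textarea.value);  // hypothetical call that posts the markup
    }, autosaveSeconds * 1000);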

I used the minified.js framework to implement most of my changes to the original JavaScript editor.

I should someday rewrite the editor to use only one JavaScript framework and to eliminate code from the original editor that I no longer need.

I mainly write in splitscreen mode, which allows me to preview the page in the right pane. But at times, I also write in single-screen mode but not in full, single-screen mode.

The nice thing about the editor is that I can use it on my iPhone 5c to create and edit pages. On the phone, I write in single-screen mode. Split-screen is too small on the phone.

And I enjoy using a database. It makes for easy searching. I tag heavily. I often use string and tag searches to find my content.

Using a database makes it easy to create and edit content from any device. I don't need to be tied to only one machine, such as a laptop/desktop.

The database makes it easy to store versions and conduct differences and revert to old versions. This exists in my Junco and Grebe apps. I should add versioning to Scaup and/or Veery, which use CouchDB and Elasticsearch.

I use memcached to speed up page display and to make access as fast or as nearly as fast as accessing a static HTML page.

Geeks love using web publishing apps that write pages to the file system and do not use a database. I think that they like to use complicated things.

Using an SQL or a NoSQL database makes life much easier for me. I'm about designing to make things easier to use.

I wonder if accessing a memcached page is faster than accessing a static HTML page on the file system.

Wouldn't the web be significantly better if publishers created simpler article pages by using text as a design technique?

Progressive enhancement. Responsive design. No JavaScript for users who are not logged into the site. Simple CSS. Simple HTML. A lot of negative space to allow the article to breathe. Comfortable line width, line height, paragraph spacing, and font size on ALL devices. This all ends in the trash can if the publisher uses a microscopic font size on phones, which many responsively-designed sites do use for some idiotic reason.

It's as if designers/publishers follow some nutjob paradigm that says the smaller the device, the smaller the font size. Why? No, don't do that. Not everyone who reads a site is 20 years old with perfect vision.


Webpagetest.org

Another testing service:

https://developers.google.com/speed/pagespeed/insights

http://m.toledoblade.com/Featured-Editorial-Home/2016/03/06/Yes-on-Toledo-tax.html

Speed:

Mobile = 33 / 100
Desktop = 64 / 100

I don't understand. According to Google's PageSpeed test, the Blade's mobile site scores worse on mobile than their desktop site does, and it scores better on desktop than their desktop site does. Whatever; it's all very slow.

http://toledotalk.com/yes-on-toledo-tax.html

Incredibly boring webpage, but it's the same content.

Speed:

Mobile = 99 / 100
Desktop = 99 / 100

ToledoTalk.com is hosted on a shared server at HE.net, meaning that TT shares a computer that also hosts dozens of other websites. Obviously, I don't have root access. The speed rating would be 100/100 if I could enable compression within the Apache web server.

For my other sites, I host at Digital Ocean, and I have total access to my own virtual server. I use the Nginx web server. That same HTML page does receive a 100/100 score.
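Enabling that compression in Nginx takes only a few lines; something like:

    gzip on;
    gzip_comp_level 5;
    gzip_min_length 1024;
    # text/html is compressed by default once gzip is on
    gzip_types text/css application/javascript;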

At webpagetest.org:

Fully Loaded:

Time: 0.535s
Requests: 2
Bytes in: 14 KB

Apache at HE.net must be adding some additional info, because that boring-ass web page is a little over 8,000 bytes in size on the file system, due to the HTML tags and some CSS.

When the version at Digital Ocean is tested, the Bytes In equals 5 KB, due to Nginx compression.

The same op-ed:

5,000 bytes downloaded, versus the Blade's version, which requires over 5 megabytes to be downloaded.
Fully loading in under a second, versus requiring over 40 seconds to load completely.
Making only 2 requests, versus over 900 requests!!! (What in the hell is going on there?)

I'm always impressed by consumers' ability to tolerate mediocrity and abuse.

I feel bad for the writers, editors, and everyone else at newspaper orgs. Their service is needed at the local level, in my opinion.

But this kind of web design is indefensible, and I would never support it with money.

I have no sympathy for media orgs if their business declines when they abuse web readers like this.

Writers and editors may feel that they cannot do anything about their company's wretched web design, but I disagree because I assume that those people also use the web, and they work at these companies.

Anyway, the keys are no JavaScript, no tracking gobbledygook, and no ads.

I suppose that I would be the only person willing to pay a hefty annual subscription fee for content displayed that simply. Photos and illustrations are still welcome. In fact, more images should be posted. Just greatly simplify the delivery container.

A fast, simple delivery mechanism does not improve bad writing. But good writing, important writing can be lost or ignored when the delivery mechanism is a train wreck.

Even digital-only media sites that formed in recent years and never created a print version are designing massively bloated websites.

But I see no improvements in the future by newspaper/media orgs, regarding their reader-hostile web designs. The only change will be that the websites will get worse.

Thanks to slow websites, we will have more services like Facebook's Instant Articles and Google's Accelerated Mobile Pages because most people read on their phones.

Some day, it may be pointless for news orgs to have their own websites because the media orgs will publish their content on many other platforms. It's not Facebook's fault. It's the fault of the publishers for designing web pages in 2016 that are 5 to 15 megabytes in size.

And if the local, daily newspapers close, I'll get my "news" by overhearing pointless third-hand drivel from people who read Facebook.


and more thoughts:

Single article "pages" can be several megabytes in size. And the UI/UX can be incredibly clunky. The media sites can bog down older machines because pounds of useless JavaScript, trackers, etc. get downloaded.

JavaScript is not the problem. The misuse and abuse of JavaScript is the problem. Media orgs have created reader-hostile websites.

For a comparison, here are the webpagetest.org results for yesterday's humorous Blade op-ed.

http://www.toledoblade.com/Featured-Editorial-Home/2016/03/06/Yes-on-Toledo-tax.html

Check these numbers out for loading a single op-ed.

Fully Loaded:

Time: 44.201s
Requests: 952
Bytes In: 5,018 KB
Cost: $$$$$

A single page makes 952 requests and ends up being 5 megabytes in size. Holy hell. Media orgs are conducting a war on the web. That's probably why every damn site wants us to download fat-ass apps to our mobile devices.

I've created other websites for myself, and when I create or update a page, I store a copy into a caching server, which enables it to be served faster. But creating static HTML files with minimal assets can be faster still.

October 2015 post: http://idlewords.com/talks/website_obesity.htm

Let me start by saying that beautiful websites come in all sizes and page weights. I love big websites packed with images. I love high-resolution video. I love sprawling Javascript experiments or well-designed web apps.

This talk isn't about any of those. It's about mostly-text sites that, for unfathomable reasons, are growing bigger with every passing year.

Here’s an article on GigaOm from 2012 titled "The Growing Epidemic of Page Bloat". It warns that the average web page is over a megabyte in size. The article itself is 1.8 megabytes long.

Here's an almost identical article from the same website two years later, called “The Overweight Web". This article warns that average page size is approaching 2 megabytes. That article is 3 megabytes long.

... consider this 400-word-long Medium article on bloat, which includes the sentence:

"Teams that don’t understand who they’re building for, and why, are prone to make bloated products."

The Medium team has somehow made this nugget of thought require 1.2 megabytes.

That's longer than Crime and Punishment, Dostoyevsky’s psychological thriller about an impoverished student who fills his head with thoughts of Napoleon and talks himself into murdering an elderly money lender.

Excellent post from May 2015

https://www.baldurbjarnason.com/notes/media-websites-vs-facebook/

The web doesn’t suck. Your websites suck. All of your websites suck.

You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken.

You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.

The lousy performance of your websites becomes a defensive moat around Facebook.

The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.

The web is not the problem. Mobile web is not the problem. Mobile web browsers are not the problem.

100 percent of the problem belongs to website owners.

https://eev.ee/blog/2016/03/06/maybe-we-could-tone-down-the-javascript/

Do web developers actually use web browsers?

I also use NoScript, so I’ve seen some other bizarre decisions leak through on sites I’ve visited for the first time. Blank white pages are common, of course.

For quite a while, articles on Time’s site loaded perfectly fine without script, except that they wouldn’t scroll — the entire page had a overflow: hidden; that was removed by script for reasons I can’t begin to fathom.

Vox articles also load fine, except that every image is preceded by an entire screen height’s worth of empty space.

Some particularly bad enterprise sites are a mess of overlapping blocks of text; I guess they gave up on CSS and implemented their layout in JavaScript.

There’s no good reason for any of this. These aren’t cutting-edge interactive applications; they’re pages with text on them. We used to print those on paper, but as soon as we made the leap to computers, it became impossible to put words on a screen without executing several megabytes of custom junk?


We might be better off if we designed sites with minimal HTML and minimal responsive design, creating what might be called responsible design.

A few suggestions:

Try designing the site without JavaScript, or test the site with JavaScript disabled, and ensure that the site still works through progressive enhancement (see the sketch after this list).
Limit the use of giant images.
Make every image count. Don't use an image just because it looks cool when it has nothing to do with the rest of the content.
Don't break the back button.
Don't break the Ctrl-C and Ctrl-V for copy and paste for desktop/laptop users.
Don't create confusing and unfamiliar click/touch actions on links and navigation.
If you can't control yourself, then create a frigging native app instead of blanking up the web.
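A sketch of that first suggestion, progressive enhancement, where the URL and markup are hypothetical: without JavaScript, the link below is an ordinary link to a comments page; with JavaScript, the comments load inline instead.

    <a href="/comments" id="comments-link">Read the comments</a>
    <div id="comments"></div>

    <script>
    // enhancement: if script runs, load the comments in place
    document.getElementById('comments-link')
      .addEventListener('click', function (e) {
        e.preventDefault();
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/comments-fragment.html');  // hypothetical URL
        xhr.onload = function () {
          document.getElementById('comments').innerHTML = xhr.responseText;
        };
        xhr.send();
      });
    </script>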

This might make web development boring, but the focus should be on the content, not on nifty animations.

Maybe this is how content providers can charge a fee for their content: offer the plain, simple, usable version of the site for a price, while the free users get deluged with the bloated, ad-heavy, so-called sophisticated, modern version.

I understand that geeks like to incorporate their new skills into new and existing projects, but the hip tech-of-the-day should still provide value to the end users.

Too much of this new stuff seems like a solution in need of a problem. Geeks cannot find enough problems to solve with the new tech, so they create new problems by unnecessarily gumming up the works on web sites that did not have problems.

The web experience is becoming increasingly frustrating. Sites that once worked fine by my standards are now becoming so annoying that I may stop reading them, or I'll read their content only if it's provided in an RSS feed that I can view within my site here.

On my older laptop, I use Firefox with the NoScript plug-in, so that by default, I view every website with JavaScript disabled. This speeds up page load time dramatically.

Through the NoScript plug-in, I can enable some or all scripts for the page or the entire site either temporarily or indefinitely. The control is more with me.

For sites that fail to work without JavaScript, I will either enable JS if I like the site, as with Medium.com, or I simply move on, since it's the World Wide Web.

On mobile, however, I primarily use the Safari browser on my iPhone with JS enabled. By mobile, I mean my phone, since I no longer use a tablet. I read for long periods on my iPhone.

Well-designed sites are typically responsively-designed, so they function fine on my smartphone, although I wish designers would trend toward a larger font size for the phone.

In recent years, the font size and line-height have increased for the desktop/laptop versions of websites. But in my opinion, some responsively-designed sites use a font size that is too small for the phone.

When viewing some websites on my laptop, I resize my browser to get the "mobile" version of the responsively-designed site because it functions and looks better than the full-size version.

On my phone, I have little patience for web sites that are not responsively-designed. It's nearly 2015, and thankfully, I'm encountering non-responsive sites less often.

Mar 2015

The Pinterest website is a web abuser, even for a browsing-only user who is simply viewing the site.
https://www.pinterest.com/evanka/knitting-loom-crochet-and-anything-yarn/

Twitter's website is a web abuser on the desktop/laptop and on the phone. Unacceptable back-button behavior. Can't open a page in the background on the phone. Can't copy text with JavaScript disabled. Infuriating.

Hilariously stupid that such huge properties fail at basic web design that existed more than 20 years ago. No matter. I don't need these properties. It's a "world wide" web.

This web page about tablet weaving is superior in design to what Twitter and Pinterest create. That's because this tablet-weaving page is more aligned with the true spirit of the web.

http://www.shelaghlewins.com/tablet_weaving/TW01/TW01.htm

Curl

http://daniel.haxx.se/blog/2015/03/20/curl-17-years-old-today/

https://news.ycombinator.com/item?id=9236551

HN comments:

"If it doesn't load through curl, it's broken." --someone
So, so true. Thanks, curl.

That's pretty much my own test for a Web based API - if I can drive it from the command line using curl then great, if I can't then it's broken.

I wasn't saying not to do the fancy stuff but rather to start with something which degrades well and then have your JavaScript enhance that basic experience. If you want to know why this is a good idea, you should start using something like getsentry.com or errorception.com to record your JavaScript errors.

I've been using websites since the early 90s and this pro-single-page sentiment is getting really tiresome. You are breaking the web. You are destroying users' security. Sure, there are plenty of reasons to use JavaScript, and plenty of places where it's appropriate. It probably is a good idea for games and so forth. But requiring users to load and execute constantly-changing code from across the web in order to read a page or submit a form is in-friggin-sane.

Some one else pointed out that it'd be nice if browsers offered more support for things that certain types of developers clearly want to do. I completely agree; it'd definitely be nice to take advantage of many of the technologies which currently exist to do more, in a more structured way. But requiring code execution in order to read data is madness.

http://chase-seibert.github.io/blog/2014/05/30/rich-client-side-web-apps-gone-too-far.html

https://news.ycombinator.com/item?id=9282744

April 2015

I tried to read a story at usatoday.com, using the Firefox browser with the NoScript plug-in. Even with everything temporarily enabled for the page/site, the site functions horribly. It's an appalling UI/UX. Wow. People get paid to produce web-abusive sites. A 1995-designed site would also be superior to this usatoday.com train wreck. Amazing.

HN Thread : Please stop making infinite scrolling websites
https://news.ycombinator.com/item?id=9416017

I'm not a fan of the infinite scroll. Depending upon its implementation, it provides a clunky and confusing user experience, and the back button can be broken because clicking away and then going back may place the user at the top of the site. That's annoying after scrolling down several "pages."

Infinite scrolling makes more sense for some services, but I think that most of the time, the simple "Older" and "Newer" links work fine.

More web abuse.

https://news.ycombinator.com/item?id=9455315

Random question: How do I stop videos from auto starting on Bloomberg? I'm running Safari with no Flash, and have Ad-block on. Video doesn't start, but audio does. Super Annoying.

It's another web trend I don't understand. All news websites seem to do it, it's super obnoxious, and I don't know anybody who doesn't just rush to click stop as soon as they click the page.

It's amazing and tremendously annoying how many abusive websites launch a new browser tab when I click a link. Frigging morons. If I wanted to launch the page under the link in a new tab, then I would right click on the laptop or open in the background on the phone.

It's equally annoying, and maybe worse, when websites DISABLE the right-click or open-in-the-background option. These sites, with their bloated, silly-ass JavaScript implementations, are Grade-A web abusers.

May 12, 2015

Another wretched, web-abusive pile of steaming crap:
http://m.toledonewsnow.com/toledonewsnow/index.htm

On the phone, can't open an article page in the background with Safari.

After finishing an article and hitting the back button, the site places me back at the top, which is infuriating after I had scrolled down a long way to read the article.

Unbelievable how people pay for this kind of development. Do they test it? How is this acceptable?

Revolting. Sites like this are harmful to the web.

http://www.quirksmode.org/blog/archives/2015/05/web_vs_native_l.html

July 2015

http://www.mondaynote.com/2015/07/13/news-sites-are-fatter-and-slower-than-ever/

http://arnoldit.com/wordpress/2015/07/14/page-load-speed-let-us-blame-those-in-suits/

http://adamsilver.io/articles/the-disadvantages-of-single-page-applications/

https://news.ycombinator.com/item?id=9879685

http://blog.venanti.us/web-app-2015/

https://news.ycombinator.com/item?id=9865338

https://www.designernews.co/stories/52124-web-design-trends-that-ruin-the-user-experience

https://stratechery.com/2015/why-web-pages-suck/ - https://news.ycombinator.com/item?id=9891927

I think there's too much blame being placed on programmatic advertising. That's no excuse for 14MB pages, fixed position ads, trackers pinging the network for a full minute, etc.

John Gruber had strong words about Apple news site iMore:

I love iMore. I think they’re the best staff covering Apple today, and their content is great. But count me in with Nick Heer — their website is shit-ass. Rene Ritchie’s response acknowledges the problem, but a web page like that — Rene’s 537-word all-text response — should not weigh 14 MB.
It’s not just the download size, long initial page load time, and the ads that cover valuable screen real estate as fixed elements. The fact that these JavaScript trackers hit the network for a full minute after the page has completed loading is downright criminal. Advertising should have minimal effect on page load times and device battery life. Advertising should be respectful of the user’s time, attention, and battery life. The industry has gluttonously gone the other way. iMore is not the exception — they’re the norm. 10+ MB page sizes, minute-long network access, third-party networks tracking you across unrelated websites — those things are all par for the course today, even when serving pages to mobile devices. Even on a site like iMore, staffed by good people who truly have deep respect for their readers.

http://pxlnv.com/linklog/safari-content-blockers-shit-ass-websites/

https://news.ycombinator.com/item?id=9897306 http://developer.telerik.com/featured/the-webs-cruft-problem/

http://www.mondaynote.com/2015/07/20/20-home-pages-500-trackers-loaded-%E2%80%A8media-succumbs-to-monitoring-frenzy/

This is hilarious and extremely ironic: http://www.theverge.com/2015/7/20/9002721/the-mobile-web-sucks

q.
I hate browsing the web on my phone. I do it all the time, of course — we all do. But man, the web browsers on phones are terrible. They are an abomination of bad user experience, poor performance, and overall disdain for the open web that kicked off the modern tech revolution.
q..

I disagree. The problem is not with the mobile web browsers. The problem is with the WEB SITES.

Web sites are an "abomination of bad user experience, poor performance, and overall disdain for the open web."

And the Verge.com is one example of a web-abusive site. Its home page is horribly slow-loading, thanks to way too many useless images and probably JavaScript. With JavaScript disabled, the site's home page loads significantly faster.

These bloated websites require users to have brand new computers with the latest, fastest CPUs.

q.
Mobile Safari on my iPhone 6 Plus is a slow, buggy, crashy affair, starved for the phone's paltry 1GB of memory and unable to rotate from portrait to landscape without suffering an emotional crisis.
q..

I've never had anything remotely close to those problems in the 12 months that I've been using my iPhone 5C. I'm still using iOS 7.

q.
Chrome on my various Android devices feels entirely outclassed at times, a country mouse lost in the big city, waiting to be mugged by the first remnant ad with a redirect loop and something to prove.
q..

Um, okay. This person is not a writer.

And I've not had any issues with the Chrome browser on my iPhone. I like the fact that the browser has defaulted to the fast, smooth scroll when viewing websites. Maybe this will be the default for all mobile browsers someday. Then we'll have no need to design a website with the special CSS to make fast, smooth scrolling occur. That CSS munges up other functionality within a mobile browser, like having the top and bottom sections of the browser disappear or shrink when scrolling.

Granted, this is only theverge.com, and maybe that's why this article lacks intelligent thinking.

q.
The overall state of the mobile web is so bad that tech companies have convinced media companies to publish on alternative platforms designed for better performance on phones.
q..

It's not because of poor mobile browsers and poor phone hardware. It's because of horribly designed websites by media orgs.

So typical. A media company blames someone else.

Way down in that lengthy article, the writer finally states something intelligent.

q.
Now, I happen to work at a media company, and I happen to run a website that can be bloated and slow. Some of this is our fault: The Verge is ultra-complicated, we have huge images, and we serve ads from our own direct sales and a variety of programmatic networks. Our video player is annoying.

We could do a lot of things to make our site load faster, and we're doing them.
q..

Finally, admitting, in a round-about, back-handed way, that it's the media company's fault. And I would say it's 100 percent the media company's fault.

Yet ...

q.
But we can't fix the performance of Mobile Safari.
q..

The writer or theverge.com should design that article page with bare-minimum html, 1995-style, and then load it as a static page and test the load speed on mobile Safari.

Add a meta tag with the viewport attribute to make the page read better on the phone. And then add a tiny CSS file with a little formatting, maybe a font-family declaration, and a media query. But keep it focused on something useful.

And test that page load time.

Oh, no JavaScript. Don't need it for a user who is only reading the page.

Jul 21, 2015 tweet
https://twitter.com/richardpenner/status/623498582805512192

The Verge article blaming browsers for a shitty mobile web is 6MB and has more than 1,000 javascript errors.

In my opinion, 100 percent of the blame goes to website developers for creating messes. It's not the fault of the mobile web nor mobile web browsers.

This page loads fine and reads okay on the mobile web. It would read better if the viewport attribute were used within a meta tag to make the text display larger on a mobile browser. But as it is, it's still very readable when holding the phone in portrait mode, and the text is larger when the phone is held in landscape mode.
http://motherfuckingwebsite.com

The default font size for the above site when holding the phone in portrait mode is about the same size as many NEW responsively-designed websites that for some idiotic reason use a tiny font size.

It's amazing that RWD websites suck on the mobile web. Not everyone has perfect vision. Not everyone wants to read a ton of tiny text on each line. We don't mind vertical scrolling. Make the font size bigger.

I don't know why many website developers/designers use a smaller font size in their media queries as the screen size gets smaller.

In most cases, I use the same font size for a 320 pixel wide screen as I do for a screen that's larger than 1024 pixels. And sometimes, I increase the font size as the screen gets smaller.
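In CSS, that approach is only a few lines; a sketch:

    body { font-size: 1.125em; line-height: 1.5; }

    /* same size on a 320-pixel phone as on a big screen;
       if anything, nudge it up */
    @media (max-width: 480px) {
      body { font-size: 1.2em; }
    }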

For http://babyutoledo.com/ I use a larger font size than I normally use for both desktop and phone. Sometimes, I think the font size is too big, but on the phone, I like it. Good line-height, good spacing between paragraphs. It's a comfortable reading experience on my iPhone 5c when holding the phone in portrait mode.

It seems that for many web developers, "user comfort" is unimportant.

https://medium.com/@sophie_paxtonUX/stop-gratuitous-ui-animation-9ece9aa9eb97#.2mnswhkkm


https://www.filamentgroup.com/lab/weight-wait.html

http://developer.telerik.com/featured/whats-wrong-with-the-web/

http://product.voxmedia.com/2015/7/22/9013731/we-design-websites-not-mobile-sites

http://digiday.com/publishers/gq-com-cut-page-load-time-80-percent/

http://www.web-crunch.com/stop-it/

Gee, what a shock: bloated, slow-loading websites that get trimmed end up loading faster. And it's not only the annoying ads. Simply disabling JavaScript reduces the site's intended functionality, but it increases the site's speed.

"Tests of top 50 news sites with three ad-blockers on iPhone show significant decrease in load times for many sites, modest increase in battery life"

http://mediagazer.com/151001/p7#a151001p7

http://www.nytimes.com/2015/10/01/technology/personaltech/ad-blockers-mobile-iphone-browsers.html?_r=0

What's sad and somewhat bizarre is that people are surprised at the page load speed of a single article page when ads and JavaScript are disabled.

Better late than never in discovering their sites' UX problem.

http://blog.chriszacharias.com/page-weight-matters

https://news.ycombinator.com/item?id=10393930

And I didn't write this HN comment, posted on Oct 15, 2015:

If you have an engineering mind and care about such things - you care about complexity. Even if you don't - user experience matters to everyone.

Have you ever seen something completely insane and everyone around doesn't seem to recognize how awful it really is. That is the web of today. 60-80 requests? 1MB+ single pages?

Your functionality, I don't care if its Facebook - does not need that much. It is not necessary. When broadband came on the scene, everyone started to ignore it, just like GBs of memory made people forget about conservation.

The fact that there isn't a daily drumbeat about how bloated, how needlessly complex, how ridiculous most of the world's web applications of today really are - baffles me.

I disagree with this HN comment:

The real problem of web development is JavaScript. It’s a relic of the past that hasn’t caught up with times and we end up with half-assed hacks that don’t address the real problem. We need something faster and way more elegant.

I'm writing this from within the JavaScript editor that I borrowed in the summer of 2013 and then hacked to meet my requirements. I have installed versions of this editor in my Junco (powering this site), Grebe, Scaup, and Veery web publishing apps. I can use it easily on my phone. I write fast with it. It works for me and my web writing purposes. For this, I LOVE JavaScript.

JavaScript is not the real problem with web development. Its overuse by designers and developers is the problem. It's used when it's not really necessary, in my opinion, and that bogs down page load times.

http://toledowinter.com uses no JavaScript for the browsing user. When I log in, and when I want to edit with what I call the "enhanced" editor, then I get JavaScript.

JavaScript can be perfectly fine for the logged-in user's tools and dashboard.

http://www.npr.org/sections/thisisnpr/2015/10/27/451147757/npr-org-now-twice-as-fast

https://www.designernews.co/stories/58223-nprorg-now-twice-as-fast

Designer News comments:

Main reason it is faster than most news sites, beyond all this excellent work: they're not beholden to 3rd-party advertising tech.

another comment:

The async javascript loading is something I still haven't tried yet, but I'm sure would really help the sites we're building.

From the article:

Already one of the fastest websites in the news industry, NPR.org now loads twice as fast as it did previously, furthering public radio's commitment to mobile audiences.

  1. Load as much JavaScript asynchronously as possible. Most CSS, and all synchronously-loaded JavaScript, needs to be loaded and interpreted by the browser before a website finishes loading. JavaScript assets that don't affect the initial rendering of your website can be loaded asynchronously by using the async attribute or a JavaScript file loader like RequireJS.

  2. Optimize image assets. When developing a responsive website, it's important to be mindful of how much of your users' bandwidth your site will require to load, and it's not unusual for images to be among the heaviest assets on your site. [what? get outta here.]

  3. Measure constantly. There are lots of tools available to developers to help identify areas ripe for performance improvement, including PageSpeed and YSlow, and they're tremendously useful.

  4. Take testing seriously. Developers wrote unit tests as they worked, and the full NPR.org team began 15-minute, at-your-desk testing sessions a month before launch. In the final two weeks, the team gathered for highly structured, extended sessions. We held separate sessions for mobile and desktop testing.
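Point 1 in plain HTML, with hypothetical filenames: the async and defer attributes let the browser keep rendering while scripts download.

    <!-- synchronous: blocks parsing until downloaded and executed -->
    <script src="/js/core.js"></script>

    <!-- async: runs as soon as it arrives, in any order -->
    <script src="/js/analytics.js" async></script>

    <!-- defer: runs after the document is parsed, in order -->
    <script src="/js/comments.js" defer></script>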

http://digiday.com/publishers/washington-post-cut-page-load-time-85-percent/

https://news.ycombinator.com/item?id=10593276

http://www.wired.com/2015/11/i-turned-off-javascript-for-a-whole-week-and-it-was-glorious/

http://tantek.com/2015/069/t1/js-dr-javascript-required-dead

http://idlewords.com/talks/website_obesity.htm

https://medium.com/@wob/the-sad-state-of-web-development-1603a861d29f#.ifrgio664

"Really all I’m saying is don’t build a SPA. A SPA will lock you into a framework that has the shelf life of a hamster dump. When you think you need a SPA, just stop thinking."

http://joneisen.me/2016/01/13/great-state-of-web-development.html

The world wants Single Page Apps (SPAs), meaning we have to move huge amounts of logic from the server to the browser. We’ve been doing this for years, but in 2015, we’ve found better ways to build these large sprawling front end apps.

Eewww. Maybe the world wants native apps. Why not simply build native apps?

Are these SPAs used for internal web apps at companies, where logged-in users perform tasks? If so, then okey-dokey.

"... we’ve found better ways to build these large sprawling front end apps."

Great. Saddle users' devices with large, sprawling front-end apps. If these piles of steaming poop are used to display text-based content to non-logged-in users, then why?

If the user-experience is improved, then the SPA is a success.

If the user-experience is diminished by a bloated, sluggish, clunky web site, then the SPA is a massive failure. Go back to 1995.

https://www.baldurbjarnason.com/notes/media-websites-vs-facebook/ - Fantastic post. I stumbled upon it on Jan 22, 2016.

“The web doesn’t suck. Your websites suck. All of your websites suck. You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken. You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.”

—Facebook and the media: united, they attack the web

Apparently, the web is broken and slow despite the fact that the apps are using the same infrastructure and standards as the web. Guess how those Instant Articles are formatted? HTML. Guess how those articles get to the app? HTTP.

Those two techie terms should sound familiar to you.

Even the web’s old guard is worried. The web can’t compete. The web can’t compete. The web can’t compete. End times.

http://daringfireball.net/2015/05/facebook_instant_articles

https://500ish.com/facebook-instant-karma-4a4bd4f3eca?gi=5cf41ca63561

baldurbjarnason.com references the above links and then writes:

There’s just one problem with this. It’s completely untrue. Here’s an absolute fact that all of these reporters, columnists, and media pundits need to get into their heads:

The web doesn’t suck. Your websites suck.

The web is slow???? No. Wrong. Websites seem slow because publishers force users to download megabytes of useless crap. HTTP is not slow. Simple pages load almost instantly.

The lousy performance of your websites becomes a defensive moat around Facebook.

Of course, Facebook might still win even if you all had awesome websites, but you can’t even begin to compete with it until you fix the foundation of your business.

If your web developers are telling you that a website delivering hypertext and images can’t be just as fast as a native app (albeit behaving in different ways) then you should fire them.

Peter-Paul Koch, web browser tester extraordinaire, picks up on the phrase that I highlighted in the John Gruber quote and runs with it.

The web definitely has a speed problem due to over-design and the junkyard of tools people feel they have to include on every single web page. However, I don’t agree that the web has an inherent slowness. The articles for the new Facebook feature will be sent over exactly the same connection as web pages. However, the web versions of the articles have an extra layer of cruft attached to them, and that’s what makes the web slow to load. The speed problem is not inherent to the web; it’s a consequence of what passes for modern web development. Remove the cruft and we can compete again.
Tools don’t solve the web’s problems, they ARE the problem by Peter-Paul Koch (841 words).

The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.

This is a long-standing debate. Except it’s only long-standing among web developers. Columnists, managers, pundits, and journalists seem to have no interest in understanding the technical foundation of their livelihoods. Instead they are content with assuming that Facebook can somehow magically render HTML over HTTP faster than anybody else and there is nothing anybody can do to make their crap scroll-jacking websites faster. They buy into the myth that the web is incapable of delivering on its core capabilities: delivering hypertext and images quickly to a diverse and connected readership.

We continue to have this problem because your web developers are treating the web like an app platform when your very business hinges on it being a quick, lightweight media platform with a worldwide reach.

Paraphrasing:

The mobile web doesn’t suck. Mobile web browsers don't suck. Your websites suck.

Build websites. Don't try to build native app sites. If you want native app functionality, then build a native app. Quit trying to make websites act like native apps.

http://bradfrost.com/blog/post/this-is-an-updated-website/ - Posted on 11.06.12 -

Oh, and Javascript on the site? There is none. That will probably change, but I think that’s pretty crazy.

Feb 2016

http://mattgemmell.com/the-reader-hostile-web/

found this older post:
https://timkadlec.com/2015/05/choosing-performance/

http://daringfireball.net/linked/2016/02/10/google-amp

One of the best posts that I have seen. At least a few of us think this way.
https://eev.ee/blog/2016/03/06/maybe-we-could-tone-down-the-javascript/

"Do web developers actually use web browsers?"

And thankfully, another person recognizes the wretchedness in Twitter's web design. It has to be the worst web design for a company with such a high Wall Street value. But most Twitter users probably access the service via a desktop/laptop app or a mobile app. Maybe the web version of Twitter is designed to be incredibly clunky to encourage more people to download an app.

About Twitter:

See, here is a screenshot of a tweet, with all of the parts that do not work without JavaScript highlighted in red.

That × button at the top right, and all the empty surrounding space? All they do is take you to my profile, which is shown in a skeletal form behind the tweet. They could just as well be regular links, like the “previous” and “next” links on the sides. But they’re not, so they don’t work without JavaScript.

That little graph button, for analytics? All it does is load another page in a faux popup with an iframe. It could just as well be a regular link that gets turned into a popup by script. But it’s not, so it doesn’t work without JavaScript.

The text box? Surely, that’s just a text box. But if you click in it before the JavaScript runs, the box is still awkwardly populated with “Reply to @eevee”. And when the script does run, it erases anything you’ve typed and replaces it with “Reply to @eevee” again, except now the “@eevee” is blue instead of gray.

That happens on Twitter’s search page, too, which is extra weird because there’s no text in the search box! If you start typing before scripts have finished running, they’ll just erase whatever you typed. Not even to replace it with homegrown placeholder text or apply custom styling. For no apparent reason at all.

About other sites:

I also use NoScript, so I’ve seen some other bizarre decisions leak through on sites I’ve visited for the first time. Blank white pages are common, of course. For quite a while, articles on Time’s site loaded perfectly fine without script, except that they wouldn’t scroll — the entire page had a overflow: hidden; that was removed by script for reasons I can’t begin to fathom. Vox articles also load fine, except that every image is preceded by an entire screen height’s worth of empty space. Some particularly bad enterprise sites are a mess of overlapping blocks of text; I guess they gave up on CSS and implemented their layout in JavaScript.

There’s no good reason for any of this. These aren’t cutting-edge interactive applications; they’re pages with text on them. We used to print those on paper, but as soon as we made the leap to computers, it became impossible to put words on a screen without executing several megabytes of custom junk?

I can almost hear the Hacker News comments now, about what a luddite I am for not thinking five paragraphs of static text need to be infested with a thousand lines of script. Well, let me say proactively: fuck all y’all. I think the Web is great, I think interactive dynamic stuff is great, and I think the progress we’ve made in the last decade is great. I also think it’s great that the Web is and always has been inherently customizable by users, and that I can use an extension that lets me decide ahead of time what an arbitrary site can run on my computer.

Yes, when I log into a web service, I expect to encounter an elegantly-designed, dynamic web interface that has been designed and developed by extremely talented people.

In my opinion, that happens when I log into my Digital Ocean account. The JavaScript helps make the experience smooth and easy. The JavaScript is not used to show-off. The JavaScript seems to be a background experience. The experience is smooth and maybe unnoticeable, which is even better. I log in, perform a task or two, and then exit. I'm not looking to be wowed by fancy tech.

The JavaScript should act like an offensive lineman who's doing a great job of run-blocking and protecting the quarterback: he goes unnoticed by fans and media.

When JavaScript is misused in a show-offy fashion, it becomes an obvious, annoying foreground experience. It becomes the offensive lineman who gets noticed for committing penalties and failing to block the rusher who crushes the QB.

More from eev.ee:

I’m not saying that genuine web apps like Google Maps shouldn’t exist — although even Google Maps had a script-free fallback for many years, until the current WebGL version! I’m saying that something has gone very wrong when basic features that already work in plain HTML suddenly no longer work without JavaScript. 40MB of JavaScript, in fact, according to about:memory — that’s live data, not download size. That might not sound like a lot (for a page dedicated to showing a 140-character message?), but it’s not uncommon for me to accumulate a dozen open Twitter tabs, and now I have half a gig dedicated solely to, at worst, 6KB of text.

Reinventing the square wheel

You really have to go out of your way to do this. I mean, if you want a link, you just do <a href="...">label</a> and you are done.

This is what people mean when they harp on about “semantics” — that there’s useful information to be gleaned.
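To make the "semantics" point concrete, compare a plain link with the script-only reimplementation being complained about (the second snippet is a generic anti-pattern, not any particular site's code):

    <!-- The browser knows this is a link: open in new tab, middle-click,
         right-click menu, status-bar preview, and keyboard focus all work. -->
    <a href="/next">Next</a>

    <!-- The browser knows nothing about this one. Every behavior listed
         above is lost unless someone reimplements it by hand in script. -->
    <div class="next" onclick="window.location.href = '/next'">Next</div>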

If I may offer some advice

Accept that sometimes, or for some people, your JavaScript will not work. Put some thought into what that means. Err on the side of basing your work on existing HTML mechanisms whenever you can. Maybe one day a year, get your whole dev team to disable JavaScript and try using your site. Commence weeping.

If you’re going to override or reimplement something that already exists, do some research on what the existing thing does first. You cannot possibly craft a good replacement without understanding the original. Ask around. Hell, just try pressing / before deciding to make it a shortcut.

Remember that for all the power the web affords you, the control is still ultimately in the end user’s hands. The web is not a video game console; act accordingly. Keep your stuff modular. Design proactively around likely or common customizations. Maybe scale it down a bit once you hit 40MB of loaded script per page.
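On the "try pressing / first" point: at minimum, a shortcut handler can stay out of the way of typing and of key combinations the browser already owns. A rough sketch, with a hypothetical #search element (and note that / itself already means quick-find in Firefox, which is exactly why eev.ee says to research first):

    <script>
      document.addEventListener('keydown', function (e) {
        // Never fire while the user is typing in a form field.
        var t = e.target;
        if (t.tagName === 'INPUT' || t.tagName === 'TEXTAREA' || t.isContentEditable) {
          return;
        }
        // Never fire when a modifier is held; the browser may own that combo.
        if (e.ctrlKey || e.altKey || e.metaKey) {
          return;
        }
        if (e.key === 's') { // hypothetical "focus search" shortcut
          e.preventDefault();
          document.querySelector('#search').focus(); // hypothetical element
        }
      });
    </script>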

https://www.igvita.com/2016/01/12/the-average-page-is-a-myth/

http://www.businessinsider.com/enders-analysis-ad-blocker-study-finds-ads-take-up-79-of-mobile-data-transfer-2016-3

"Ads on news sites gobble up as much as 79% of users' mobile data"

That creates a slow web browsing experience, but according to the verge.com writer from the summer of 2015, the blame belongs to mobile web browsers.

One of the reasons consumers download mobile ad blockers is the impact ads have on their data plans. A report released Wednesday from Enders Analysis appears to back up that claim — at least when it comes to a sample of news websites.

March 2016

http://instartlogic.github.io/rwdsummit/#slide21

According to the HTTP Archive, the average page among the top 1,000 sites is 2,123 KB, compared to 626 KB in 2010.

HTTP Archive breakdown (2016 vs. 2010):
Images: 1,253 KB vs. 372 KB
JS: 425 KB vs. 103 KB
CSS: 64 KB vs. 24 KB


http://mrmrs.io/writing/2016/03/30/responsive-design/

Please stop building websites that aren't responsive. Please stop assuming that I want to use your site on a desktop. Please stop serving up a mobile view that doesn't have the content I want and forcing me to try and read your small text on a 'desktop version'.

Please stop breaking the internet.

http://www.webperformancetoday.com/2014/02/25/the-great-web-slowdown-infographic/

https://brave.com/blogpost_4.html - "Brave's Response to the NAA: A Better Deal for Publishers"

https://mobiforge.com/research-analysis/the-web-is-doom
https://news.ycombinator.com/item?id=11548816
http://www.wired.com/2016/04/average-webpage-now-size-original-doom/

http://blogs.harvard.edu/doc/2016/04/15/get-it-right-forbes/

http://thin.npr.org/
Yep, thin.npr.org is a great web design. Okay, it lacks a viewport meta tag, so it doesn't display as nicely on a phone, but at least a reader can view the site in landscape mode and zoom in to enlarge the text. The browser's back button works properly with the site. Links are underlined. The thin site supports the open web better than most sites. It uses minimal HTML. The pages are lightweight and fast-loading. Only a smidge of inline JavaScript exists: four lines, two of which are curly braces. No external JavaScript is loaded, and no inline or external CSS is used.
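For reference, the viewport declaration thin.npr.org lacks is a single standard line in the page's head, not anything site-specific:

    <meta name="viewport" content="width=device-width, initial-scale=1">

That one tag would let phones render the text at a readable size without giving up any of the page's lightness.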

But the trend, it seems, is to go back to underlining links, at least within the body of an article. Navigation and menu links at the top and bottom of a site may not be underlined, but those single-word nouns and verbs are usually obvious links. It's the links within the body of an article that can be hard to notice when the text and link colors don't offer enough contrast.

Keep it simple and underline links within the body of an article.
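That convention is a few lines of CSS, assuming the page uses the standard article and nav elements:

    <style>
      /* Body links sit inside running text, so underline them. */
      article a { text-decoration: underline; }
      /* Navigation links are obvious from placement; the underline is optional. */
      nav a { text-decoration: none; }
    </style>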

http://brutalistwebsites.com/
https://news.ycombinator.com/item?id=11517491

Another HN commenter:

If everything's a link, then you hardly need special notation to call out the links from the non-links. It's not a conscious effort to make links more identifiable; it just didn't occur to the creator that you can do something with links other than underline them.

Ahh, but sometimes a site mixes links with regular text on its pages, and then the links must be underlined to stand out.

Another HN comment:

Just showed some of these to a generally non-tech-savvy friend who said he didn't like them because they looked "too 90s." Personally I love them because they load fast, are easy to read, and don't require a knowledge of a bunch of different frameworks to write.

Another HN comment:

I have been fighting for years to get people used to "90s aesthetics."

It's even more important for web design. Give me simple HTML with a touch of css, and javascript only if it's absolutely necessary. I can think of hardly any websites that I would consider "beautiful" these days for exactly this reason.

PS: I'm not sure I would classify these sites as brutalist; perhaps 'utilitarian' or 'functional' would be better descriptors.

Wow. Right on. I did not post that comment, but it represents my thinking.

http://www.webdirections.org/blog/the-website-obesity-crisis/

HN comment:

You can make something that doesn't require tons of frameworks and loads fast while NOT looking like a relic of the days of Kazaa. The fact that so many developers are too lazy to do so does not mean we should throw the baby out with the bathwater and go back to times new roman black-on-white.

True. Worthwhile CSS and JavaScript are fine.

Back to the brutalist HN thread:

I despise the trend for content minimalism. Where once there was a headline and sub, there's often now a cute little tile with picture and trimmed headline needing 20x the space to show the same number of stories. bbc.co.uk

Yep. I despise that design look and function too. Wasted space. Bloated web page. All those tinted images with titles overlaid on them. Hideous.

HN comment:

But websites and buildings are something that real people actually use, so fuck off with your -isms. Make websites where content is readable and easy to navigate. Make buildings that are great to live and work in as opposed to those whose mockups look unique and stunning in "Architectural wankery monthly".

HN:

As an overt visual design paradigm, meh. But hallelujah to the idea of a page that just has content, without the trendily de rigueur fucktons of overblown css and pointless javascript that adds 0 and only serves to crash my crappy mobile browser.

http://brutalistwebsites.com/37signals.com/

37signals (Jason Fried)
Q: Why do you have a Brutalist Website?
A: I see it more as a letter, not a web site. It just happens to be on the web. Also, re: "In its ruggedness and lack of concern to look comfortable or easy"... For me it's all about comfort and ease - especially when it comes to reading. I don't consider our site rugged any more than a simple letter is rugged.

http://www.opera.com/blogs/desktop/2016/03/native-ad-blocking-feature-opera-for-computers/

If there were no bloated ads, some top websites would load up to 90% faster.

Proper terminology was used: the websites would load faster, not "the web" would be faster. Websites are the problem, not the web.

http://www.wired.com/2015/11/i-turned-off-javascript-for-a-whole-week-and-it-was-glorious/

https://500ish.com/drinking-the-web-through-a-straw-75ab685a1763#.scp3zw637

And that doesn’t even speak to the arguably more important issue: page load time. Compared to an AMP page, trying to load a regular page of content on the web feels like trying to suck it in through a straw. A very tiny straw.

If anyone saw a regular page of web content side-by-side with an AMP’d page, there’s no question they’d choose to see the latter, every single time.

Because on the desktop we’re all used to seeing the absolute worst of the web. That is, ridiculous widgets, awful JavaScript load times, and, of course, ads galore. AMP stripped all of the crud away and just gave me unadulterated content. And gave it to me fast.

It was such a revelation. I wanted to view all web-based content this way. Not just on mobile, everywhere.

Welcome to the dark side of wanting faster, simpler websites.

https://www.filamentgroup.com/lab/delivering-responsibly.html

"... page weight does matter. Access can be slow, expensive and prohibitive."

http://whatdoesmysitecost.com/

May 2016

http://daringfireball.net/linked/2016/05/09/nyt-podcasts

JavaScript has brought the web to the brink of ruin, but there’s no JavaScript in podcasting. Just an RSS feed and MP3 files.

I understand what he's saying, but I would qualify it more.

"The misuse of JavaScript has brought the web to the brink of ruin."

Who gets to define the misuse? Does the article page or site function fine without JavaScript? What do a thousand pounds of JavaScript files do for a single article page that's accessed by a browsing-only reader?

For a site that requires the user to log in, I expect the dashboard or whatever to employ JavaScript in an elegant manner.

Using JavaScript for useless extravagance is breaking the web.

http://digiday.com/publishers/new-york-magazine-penalizes-advertisers-data-hogging-ads/

https://medium.com/@pete/how-many-floppy-disks-do-you-need-to-fit-an-article-from-the-atlantic-8924a9e057ff#.bz9c2ueh2

https://adactio.com/journal/10675/amp
https://adactio.com/journal/10665

https://adactio.com/journal/10708
https://news.ycombinator.com/item?id=11770774
http://jothut.com/cgi-bin/junco.pl/blogpost/71250/25May2016/JavaScript-vs-No-JavaScript-War-of-May-2016

https://begriffs.com/posts/2016-05-28-rss-responsive-design.html?hn=1

https://news.ycombinator.com/item?id=11798646

Top HN comment:

On a more serious note - RSS is the Great Web Leveller. It spites your fancy CSS hacks, it's disgusted by your insane javascript, and it will piss all over your "mobile-optimized" crap. No semantic markup == no party; because markup is for robots, and RSS parsers are very stubborn robots that can see through web-hipster bullshit like Superman through walls.
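The comment is blunt, but the point stands: semantic markup is what survives when all the presentation is stripped away. A sketch of the kind of structure any feed parser or reader mode can digest (generic markup, not taken from the linked article):

    <article>
      <h1>The headline</h1>
      <p>Body text that a feed reader or reader mode can extract cleanly.</p>
      <p>More text, with a plain <a href="/related">link</a> that still works.</p>
    </article>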

Another HN comment, well down in the thread:

Regarding reading news on the Kindle: I've noticed that the browser becomes much more responsive (read: usable) if one disables JavaScript altogether.

Sure, some websites will break, but most will work well enough to be able to read articles.

Disabling JavaScript is also my trick for actually being able to browse the internet on a phone these days. In Firefox for Android, you can install a plugin to toggle it on/off for when you need it.

A bit sad that you have to do this to get a decent experience, but what can you do...

That's me, except on desktop/laptop. The latest, greatest CPU is needed to read websites, especially those produced by media orgs.

Pages load incredibly fast when JavaScript is disabled. Any slowness is probably due to either Firefox or the NoScript add-on/plug-in.

#manifesto