Enable CORS for Your Blog (blogsareback.com)
68 points by cdrnsf 10 hours ago | 26 comments



In uni I built a simple web scraper in JavaScript. It just ran in the browser. It would fetch a page, extract the links, then fetch those pages.

You could watch it run in realtime and it would build out a tree of nested links as it crawled. It was a lot of fun to watch! (More fun than CLI based crawlers for sure.)

The only issue I had was not being able to fetch pages from the front end due to CORS, so I just added a "proxy" to my server in 1 line of PHP. There, now it's secure? ;)
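
Roughly the shape of the browser side, as a sketch from memory (the /proxy endpoint stands in for that one-line PHP script):

    // Fetch a page, pull out its links with DOMParser, and fall back to a
    // same-origin proxy when the target doesn't send CORS headers.
    async function fetchLinks(url: string): Promise<string[]> {
      let res: Response;
      try {
        res = await fetch(url); // only works if the target allows cross-origin reads
      } catch {
        res = await fetch(`/proxy?url=${encodeURIComponent(url)}`); // CORS fallback
      }
      const doc = new DOMParser().parseFromString(await res.text(), "text/html");
      return Array.from(doc.querySelectorAll("a[href]"), (a) =>
        new URL(a.getAttribute("href")!, url).href
      );
    }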


That's rather like "gnu.org", which blocks you when you're using a slightly older browser. But when you change your user agent to "curl" it magically starts working. Or the German news site "spiegel.de", which also blocks old browsers from accessing the site entirely, unless you change the user agent to "bingbot" (or some other random bot from their whitelist). *insert a facepalm emoji here*

Hey folks, I'm the developer working on Blogs Are Back. WakaTime has me clocked in at over 900 hours on this project so far...

If CORS weren't an issue, it could've been done in 1/10th of that time. But if that were the case, there would've already been tons of web-based RSS readers available.

Anyway, the goal of this project is to help foster interest in indie blogs and help a bit with discovery. Feel free to submit your blog if you'd like!

If anyone has any questions, I'd be happy to answer them.


> style="opacity:0;transform:translateY(20px)"

In my opinion, that's a bigger problem than CORS. A proxyless web feed reader is a lost cause; you're wasting your time, because only a small minority of sites are ever going to support it. But that opacity and transition nonsense gratuitously slows down page loading for everyone, and hides content completely for anyone not running JS.

(What I would also like to know is: how come this is the third time I've seen exactly this—each block of content having this exact style attribute—in the past month, when I don't remember ever encountering it before?)


The entire web app is JS based. It's a requirement I'm ok with.

And to answer your question, you're seeing that kind of styling so frequently because it's likely part of Framer Motion, an extremely popular animation library:

https://www.npmjs.com/package/framer-motion https://www.npmjs.com/package/motion
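
Typical usage looks roughly like this (a sketch, not our exact code; the component name is made up). The `initial` prop is what ends up serialized as that inline style before the animation runs:

    import { motion } from "framer-motion";
    import type { ReactNode } from "react";

    export function FadeIn({ children }: { children: ReactNode }) {
      return (
        <motion.div
          initial={{ opacity: 0, y: 20 }}   // rendered as opacity:0;transform:translateY(20px)
          animate={{ opacity: 1, y: 0 }}
          transition={{ duration: 0.4 }}
        >
          {children}
        </motion.div>
      );
    }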


Would also be great if the animations respected the `prefers-reduced-motion` setting, instead of forcing animations that reduce accessibility.
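
Something along these lines would do it; a sketch using Framer Motion's own hook (component name made up):

    import { motion, useReducedMotion } from "framer-motion";
    import type { ReactNode } from "react";

    export function RespectfulFadeIn({ children }: { children: ReactNode }) {
      // true when the OS-level "reduce motion" preference is set
      const reduceMotion = useReducedMotion();
      return (
        <motion.div
          initial={reduceMotion ? false : { opacity: 0, y: 20 }} // skip the slide-in entirely
          animate={{ opacity: 1, y: 0 }}
        >
          {children}
        </motion.div>
      );
    }

Plain CSS can do the same with an @media (prefers-reduced-motion: reduce) block.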

I think cooler heads will agree that a middle ground where the content is available on the initial request is best. But what do I know /s

Hey! Blogs Are Back is cool! Nice to see more modern RSS readers, and also thematic blog collections. If you seek more curated blogs to share with your users, check out my project https://minifeed.net/


Hey, this is very interesting! As someone working on an extension that acts as an ActivityPub client, I don't have to deal with CORS issues much (most servers configure CORS properly, and the extension can bypass CORS anyway), but I just spent a good chunk of my weekend working on a proxy that could deal with Mastodon's "authorized fetch".

So, basically, for any URI I need to resolve, I first try to fetch it directly and fall back to making the request through the proxy if I get any kind of authentication error.
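
Roughly this flow (a sketch; the proxy URL and the exact status codes are assumptions):

    async function resolveUri(uri: string): Promise<unknown> {
      // Try the direct, unsigned fetch first.
      const direct = await fetch(uri, {
        headers: { Accept: "application/activity+json" },
      });
      if (direct.status !== 401 && direct.status !== 403) {
        return direct.json();
      }
      // Authorized fetch rejected the unsigned request; retry through the
      // proxy, which can sign the request before forwarding it.
      const proxied = await fetch(
        `https://proxy.example/fetch?uri=${encodeURIComponent(uri)}`
      );
      return proxied.json();
    }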


You need to put a screenshot of the app on your page.

How can someone add platforms to the guide? I want to add Caddy.
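
Here's roughly what I'd submit, if it helps (untested sketch; adjust the matcher to your own feed paths):

    example.com {
        @feed path /feed.xml /rss.xml /atom.xml
        header @feed Access-Control-Allow-Origin "*"
    }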

It's really cool that you can simply get the full text from sites that refuse to offer the entire text in their RSS feed, without having to go to their site. However, there are a few things that don't work so well. When you add feeds from YouTube, the video is not embedded; even if that feature is out of scope, it would be good if the title and a link to the video were displayed instead. Bluesky posts also lack their embedded content. Furthermore, a maximum of 100 feeds is clearly not enough: if you add things like YouTube, Reddit, Lemmy, Bluesky, etc. you will reach the limit very quickly. Even if this isn't content you actually read in the reader, it would be annoying to need two different RSS apps just for that reason.

I have done this. I also relaxed my Cross-Origin-Embedder-Policy header - https://developer.mozilla.org/en-US/docs/Web/HTTP/Reference/...

Huh, that's a pretty interesting request. And it makes sense to me. I've enabled it on my RSS feed. I wanted to see if I could add my blog feed to it to test but when I went to do so I had to install a Chrome extension on your app to do it. All right, if someone wants my blog for whatever reason that badly, they can now do it.

I have noticed some sites block cross-origin requests to their feeds. It's annoying, but I just use a server now so I don't care. I very much recommend RSS readers use a server, as it means you get background fetch and never miss a story on high-volume sites like HN.

From the linked post, I think the point of fetching it in-browser is so that your subscriptions stay private. Idk why this is desirable, but if people want it, it’s nice to give them the option.

This feels like such a weird ask?

Why would anyone do this, so their content can be easily read elsewhere, potentially with a load of ads surrounding it?

This seems to really reason through only the happy path, ignoring bad actors, and there'll always be bad actors.


If a malicious website wanted to copy a blog to put ads on it, they can already just scrape it outside the browser on their end, which has the "benefit" of preventing the original blog from taking the copy down.

CORS also doesn't prevent a popular website with a personal vendetta[0] against a blogger from DDOSing the blog with their visitors, since CORS doesn't block requests from being sent.
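
To make that concrete, a minimal sketch (URL is hypothetical): the browser sends the request either way; CORS only decides whether the page's JS gets to read the response.

    fetch("https://example-blog.test/post", { mode: "no-cors" }).then((res) => {
      // res.type === "opaque": the body is unreadable from this page's JS,
      // but the blog's server has already received and served the request.
      console.log(res.type);
    });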

For a purely static website, there shouldn't be any risk from enabling CORS.

[0]: https://news.ycombinator.com/item?id=46624740


> This seems to really reason through only the happy path, ignoring bad actors, and there'll always be bad actors.

True, but bad actors can defeat any security mechanism you put in place with a proxy or a copy-and-paste, so the downside risk isn't worth worrying about. The upside of allowing traffic is that the content you presumably want people to read can be read by more people. For all but the most popular blogs, that's probably a net benefit.


To be fair, they do explain their motivation. It's an in-browser RSS reader, so it's fetching the RSS feed directly without a proxy server. There's not much risk since the content is public and non-credentialed. The bigger risk is misconfiguring CORS and inadvertently exposing other paths with the wildcard.
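
One way to avoid that is to scope the wildcard to the feed route only; a quick sketch in plain Node (paths and port are made up):

    import { createServer } from "node:http";
    import { readFile } from "node:fs/promises";

    createServer(async (req, res) => {
      if (req.url === "/feed.xml") {
        // Public, non-credentialed content: the wildcard is safe here.
        res.setHeader("Access-Control-Allow-Origin", "*");
        res.setHeader("Content-Type", "application/rss+xml");
        res.end(await readFile("feed.xml", "utf8"));
        return;
      }
      // Everything else keeps the default same-origin behaviour.
      res.statusCode = 404;
      res.end("not found");
    }).listen(8080);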

Also, why would an RSS reader be a website? An application installed on your PC is superior in every way.

I couldn't feel more strongly in the other direction. The fewer programs running on my computer, the better. By far my preference is that "random dev code" gets placed into the strongest possible sandbox, and that's the browser.

With a website you get shared state (these days many people are using multiple devices), platform independence and sandboxing for free. Plus custom CSS and tamper scripts for customization, browser addons, bookmarks, an API for other applications to consume the content, and probably more.

Um, no? The most popular RSS reader back when RSS readers were a thing was Google's, and it was a website. And why not? Like other websites, you can log in from any device that has a browser and immediately pick up where you left off, including work machines where you aren't allowed to install native apps.

So, about that... That's how I read RSS feeds on my Kindle.

https://github.com/adhamsalama/simple-rss-reader



