Hacker News | repple's comments

Yeah, a very admirable project indeed.

Ultimately it’s up to the author to make that explicit choice. I think that AI does and will enhance writing and the depth and breadth of analysis one could perform. But, to be trustworthy, people will need to either lay all their cards on the table or find other ways to gain trust over time. Maybe people need to provide some context about which model was used and in which ways. What % of the final output is AI vs. author. I mean, if I see “100% composed by human author” stated somewhere, then there’s my cue to at the very least learn a little about the author. Certainly more complexity and discernment for readers. Depressing? In some ways, maybe; but I’m kind of optimistic. Imagine what Tolkien could worldbuild armed with AI… but then it wouldn’t be Tolkien.


Significant AI smell in this write-up. As a result, my current reflex is to immediately stop reading. That's not a judgement on the actual analysis and human effort which went in; it’s just that the other context is missing.


The author is from Turkey (where I’m also originally from).

Believe it or not, when you write a blog post in a different language, it really helps to use an LLM, even just to fix your grammar mistakes etc.

I assume that’s most likely what happened here too.


IMO it would make sense to add a disclaimer then, e.g. “I wrote this myself but had AI edit”

I have no problem with people using AI, especially to close a language gap.

If you disclose your usage I have a _lot_ more trust that effort has been put into the writing despite the usage


I do believe it, but for whatever it's worth (maybe not much!):

If the author is willing and able to write understandable English, I'd prefer to read their version (even if it's very imperfect) than the LLM-polished version.

Alternatively, I'll happily read an article that was written in the author's native language and then translated directly to English.

This one bothered me because it's pretty clearly neither of those things, and so it reads just like any other LLM-written/LLM-polished piece.

[edit: just realised 'willing and able' might sound snarky in some way! All I meant was to acknowledge that even if you can write in a second (or third, etc.) language, you might not want to]


I believe it


Honestly I'd rather read imperfect English


Here's what gave it away for me

> The remaining difference is noise, not a fundamental language gap. The real Rust advantage isn't raw speed -- it's pipeline ownership.


There’s an unmistakable rhythm beginning with the first paragraph. For me, the trigger was “Same problems, same Apple M4 Pro, real numbers.” in the third.

My own AI usage has scarred me into detecting these things.

https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing


I didn't notice any signs of AI writing until seeing this comment and re-reading (though I did notice it on the second pass).

That said, I think this article demonstrates that focusing on whether or not an article used AI might be focusing on the wrong “problem.” I appreciate being sensitive to the "smell" (the number of low-effort, AI posts flying around these days has made me sensitive too), but personally, I found this article both (1) easy to read and (2) insightful. I think the number of AI-written content lacking (2) is the problem.


Your initial focus is to prioritize which content to consume.


I also seem to be developing an immune response to several slopisms. But the actual content is useful for outlining tradeoffs if you need to make your Python code go faster.


I have the same issue now. It's especially annoying when it happens while reading a "serious" publication like a newspaper or long-form magazine. Whether it's because an AI wrote it or "real" writers have spent so much time reading AI slop that they've picked up the same style is kinda by the by. It all reads to me like SEO, which was the slop template that LLMs took their inspiration from, apparently. It just flattens language into the most exhausting version of it, where you need to subconsciously blank out all the unnecessary flourishes and weird hype phrases to try to figure out what is actually being said. I guess humans who learn to ignore it might do better in this brave new world, but it's definitely annoying that humans are being forced to adapt to machines instead of the other way around.


Fwiw, I thought the article was full of great information and well researched. I think your reflex is holding you back.


So much "is real". It is ok to check your grammar, but this is slopabetes inducing.


I had the same reflex. I just can't read AI-edited stuff. I don't know why; it just reads very hollow and 'icky'.


I got the same sense, but nowadays I can't be sure whether a text is AI or the writer's style has absorbed LLM tropes.


“The numbers are real.” But the voice is not.


If only we applied the same reflex to software, even when it's 100% human-programmed.


What is the point of your post? I find it increasingly tedious to read comments about alleged AI use under almost every post. It's like complaining that you didn't want to read the submission because you didn't like their font or website design.

I think almost everyone here agrees they don't want to read AI slop, but this submission clearly wasn't that as you admit yourself.


I don't think it should be conflated with auto generated AI slop. I see a lot of snippets which were clearly manually written. I'm assuming the author used AI in a supervised manner, to smooth out the writing process and improve coherency.


Their goal of moving compute to space combined with their capacity to launch tons of payload will make this look like a tiny blip.


What is the benefit of "moving compute to space"?


It's hard for an uprising of poor people to shut it off. It's the ideal place to run your CEO / President simulations.

I say this tongue in cheek, but in all seriousness, I can't really think of any other benefit, and I no longer have a lot of faith in the good sense of some of the people involved.


Elon makes a relatively good case in the Dwarkesh podcast. I recall it like this:

1) Energy infra is going to be seriously limited on the production side, well, well below demand

2) Engineering solar for space requires fewer materials than ground-based solar (!)

3) You cut out distribution-network needs when you launch everything per-pod into space

4) SpaceX thinks it can create a scalable vertically integrated production facility to turn raw materials into space datacenter pods, with the exception of chips.

As a business bet, this is predicated on 10,000x inference demand growth - if we have that, and SpaceX can get the integrated production rolling, and get Starship launching, then these will be actively utilized at scale.

Whether you are bullish on the whole plan should, I think, come down to your take on those priors: 10kx growth, ability to manage supply chain and production, Starship outlook, and silicon access.

I'm not bearish on this after listening to the podcast; it has a very Elon-like returns distribution. If they're wrong on a lot of this, they'll probably have some moderately price-competitive datacenter facilities in space and a lot of built-up organizational know-how while Brooklyn journalists dunk on them for spending all that effort to just replicate what we have on Earth. If they're right about most of this, then between the years of experience and the cheap launch they gambled on ten years ago, they'll have a nearly insurmountable moat.


Everything relating to a datacentre that you can do in space you can do more easily on Earth, regardless of 10,000x inference growth or supply chain or production or Starship or silicon. I just don't think you can be cost-competitive with Earth-bound data centres if 'protected from the poors' isn't a selling point.

By the way, 10,000x inference growth would look like what happened with cryptocurrency mining - after a couple of years, you'd be needing to upgrade all your machines with ASICs and the market would be flooded with very cheap graphics cards. I doubt that upgrading space data centres would be fun.


Zoning is one area that’s better in space. And power density for solar is another.

I don’t get your mining analogy though - a non upgradable data center pod is either going to pay off its capital costs or it won’t. Once it has, any revenue is close to 100% profit. 10k demand increase is the opposite of mining dynamics: there you get a 10k supply increase that the price has to support, in combination with more efficient silicon. Here the demand drives revenue and earnings.

If there’s some crazy inflection point in chips then you’ll still have all the power infra in space - you can just cut the old pod loose and hook up a new one; or, more likely, manufacturing economies of scale mean you probably just keep sending up new systems and put the old ones on workloads they can manage at market prices.


> Zoning is one area that’s better in space.

Not really, though? The idea that Earth-based data centers need to be built in populated, developed areas is indeed dumb, yet it seems to be inexplicably baked into everyone's assumptions. In particular, the small discrete data centers that Musk wants to launch could go anywhere on Earth.

They could be powered by local PV arrays and batteries, they can be cooled by smaller radiators than they would need to use in space, and they could be networked via Starlink or something very much like it, just as they would need to be networked in space. There's nothing special about space, it just costs more to get there.

If he wants them to be out of reach of governments, why not put them on container ships in international waters? There are thousands at sea at any given time, and I'm sure their operators would be happy to rent them out.

Hell, put them on dirigibles that just drift around in international airspace for months at a time. Anywhere but space.

> And power density for solar is another.

Does power density matter in terrestrial solar applications? If so, why? These things can and should be deployed in oceans, deserts, and trackless wastelands. Who cares how big the solar panels are?


The problem is that you need humans to run datacenters, and so that puts ceilings on how far away from humans you can put them without the humans no longer being willing to commute there.

And the cost of building all the infra to support humans living in an area that humans are not already populating is enormous.


Well, evidently you don't need humans to run datacenters, if we're talking about launching them into LEO!

Here's an idea, let's do this instead: we put them in the desert, or on boats or zeppelins or whatever, and we pretend they're in space. If anybody asks, those fuckers are in space, man. Computin' in the cosmos.


> you need humans to run datacenters.

As far as I can tell from random articles online, it seems that as a rule of thumb, you need about 6 humans +1.5 humans per megawatt - and that's just for running the datacenter part, different people maintain the power generation infrastructure. Now, if you have to house those people in space or fly them up whenever they have to do anything, that's going to destroy your budget.

If you want to assume a level of automation that makes that unnecessary, that's fine, but then you need to also assume that same level of automation in earth based data centers too, and everything that goes with that.


All questions/comments that I don't know enough to opine on.

But, power density in terrestrial I think we can do some math and reasoning:

First, oceans are WAYYY more hostile than space. Oxidation + salt water + .. I don't think it's even close there. I don't think they are comparable.

Deserts and trackless wastelands - I have some experience with sub-Saharan logistics; a couple of points -- I would not be surprised if actual deployment to trackless wastelands is more expensive than lift. Analysts estimate $55k-85k per ton under Starship. (Elon estimates much lower; let's stick with the low end of the analyst numbers.)

Trackless wastelands are really hard to get to. For instance, in Southern Kenya -- by no means a trackless waste -- I've seen a fuel truck tipped over on its side in a river, next to a small tow truck tipped on its side in the same river, next to a larger crane trying to rescue both the original truck and the "rescue" truck -- probably a week-long ordeal, JUST for diesel delivery. This was in an area under former British rule, with roads and stuff.

Second, trackless wastelands are really hard to find. There are people everywhere, man. And they like free metal, free power, etc.

If we imagine instead just deploying to West Texas, I think the square footage does add up. 40-foot container -> call it 16 racks. Nvidia estimates 600 kW per rack in 2027 with Vera Rubin (!!JFC!!). So, call it 10 MW of power per container. Let's imagine we magically found water in West Texas and have a PUE of 1.2, so 12 MW. Solar panels are like 20 W/sq ft.

I got lazy; Claude tells me with 2.5x land needed for spacing, infra, etc, 6.5 peak sun hours, a couple of acres for storage, roughly 130 acres (0.2 sq miles) + 53 Tesla megapacks for storage per container.

I'll revise my above thoughts - there is NO WAY it's cheaper to do that in the trackless wastes than in space. I don't know about West Texas, but I don't think it's crazy to think that you might want to spend five years on engineering and production scaling instead of town and county and state and federal permitting.
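For what it's worth, that land/storage sizing checks out roughly. A sketch using the comment's own assumptions (16 racks x 600 kW rounded to 10 MW per container, PUE 1.2, 20 W/sq ft panels, 6.5 peak sun hours, 2.5x land overhead); the ~3.9 MWh per Megapack figure is my assumption, not from the comment:

```python
# Back-of-envelope check of the West Texas solar sizing.
CONTAINER_MW = 10        # 16 racks x 600 kW ~ 9.6 MW, rounded up as in the comment
PUE = 1.2                # assumed power usage effectiveness
SOLAR_W_PER_SQFT = 20    # panel output per square foot
PEAK_SUN_HOURS = 6.5     # assumed peak sun hours per day
LAND_OVERHEAD = 2.5      # extra land for spacing, roads, infra
MEGAPACK_MWH = 3.9       # assumed capacity of one Tesla Megapack
SQFT_PER_ACRE = 43_560

total_mw = CONTAINER_MW * PUE                        # 12 MW including cooling
panel_mw = total_mw * 24 / PEAK_SUN_HOURS            # panel capacity needed (~44 MW)
panel_acres = panel_mw * 1e6 / SOLAR_W_PER_SQFT / SQFT_PER_ACRE
site_acres = panel_acres * LAND_OVERHEAD             # ~127 acres, i.e. ~130
megapacks = total_mw * (24 - PEAK_SUN_HOURS) / MEGAPACK_MWH  # overnight storage, ~54

print(f"{site_acres:.0f} acres, {megapacks:.0f} Megapacks per container")
```

That lands within rounding of the "roughly 130 acres + 53 Megapacks" figure above.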


Granted, some compelling points against the "trackless wasteland" plan. All of them sound pretty valid to me.

Oceans, though -- we know how to deal with saltwater environments, we've done that for a while now. A key point is that anything you send into space or install near saltwater isn't going to last long without either regular maintenance or high up-front expense. But in this case, the equipment only has to last a few years until it's obsolete anyway, and ~5% FIT is probably tolerable. So I maintain that it's doable.

One good thing about an ocean-based platform is that it makes the heat dissipation problem go away virtually for free.

None of the challenges of running a 10 MW container full of hardware go away in space (other than the threat of nomadic scavengers, I suppose.) Yes, space-based PV arrays are smaller and lighter... but that's it, big deal. In particular, the idea of getting rid of that much heat in space without the benefit of convection, conduction, acres of expensive radiators, or magic is beyond my ability to comprehend, much less address. Everything having to do with heat removal is much harder in space.

So, given that you aren't going put 10 MW worth of hardware in a single satellite anyway, it doesn't seem valid to compare such installations on an equal basis as you're doing here. The 130-acre site you mention doesn't replace one satellite, it would probably replace a thousand of them.

You get a lot of expensive redundant requirements when you split up the problem that way, as well. These requirements will eat up any savings you might get from space-based deployment. Instead of one communications link with expensive RF hardware, you now need a thousand. Likewise, it's cheaper to build one 10 MW power substation than a thousand independent 10 kW power management solutions. And remember, this is all to support a single shipping container worth of hardware.


>Elon makes a relatively good case in the Dwarkesh podcast.

Are we still going to pretend that the man who has gotten every single prediction wrong so far knows what he is talking about?


Are we still going to pretend that the man who has revolutionized at least 2 different industries doesn't know what he's talking about?


He didn't revolutionize shit. He just threw enough VC money around and paid the right people enough to eventually make a product that sticks. It took SpaceX a multitude of crashes, downright scamming their suppliers, and lots of turnover to do something that other companies did in a few years. And Starship is just laughably stupid.

And Tesla's only success is because they were subsidized like crazy. Of course people are going to purchase cheap electric cars with no maintenance. If BYD were allowed to operate in the US, Tesla would have been in the ground a long time ago.

But I get your sentiment though. You are so far down the conservatism rabbit hole and probably have some inappropriate thoughts towards children, so you have to defend Musk till you die because god forbid you admit to yourself that you are terrible human being.


He has never made a good case, only coherent stories people believe.

He has been saying self-driving cars are right around the corner for 10+ years, using a staged video.

I will never forget this statement: "I don't know anything about EVs, so when he talked I believed him. I don't know anything about rockets, so when he talked I believed him. I most definitely know about software development, and when he opens his mouth I know he is lying."

Still don't get what people see in him. Deep down he is not a good person and will say anything to pump up his image and stock.

P.S. A number of his fans like to downvote people for calling him a bad person.


How is the cooling handled, though?


Yeah, I wonder the same thing - I keep getting told heat management in space is hard, but nobody discusses this regarding the data centers. My understanding is that one cooling mechanism is to just shoot lasers out into space (is this sci-fi?) - I guess in that case you could just send energy back to your solar rigs, depending on wavelengths. TL;DR: no idea


The whole thing is pie in the sky, same as landing people on Mars. It's cool, but if you look into it deeper it doesn't make much sense, and it's extremely challenging and on top of it all expensive as hell.


I understand that in Earth-based data centers, 30-40% of the power is spent on cooling. That's in facilities that can cool using conduction and convection to the outside environment.

I don't have any experience in this area, but it seems like for every square meter of solar panel you need about half that in radiator area. And depending on your orbit, these are probably not static things just sitting there, they need to be orientated correctly to work and their correct orientations will change over time.

The worry for me is the level of human maintenance required. The ISS has probably the biggest solar array around, and they send humans out to perform maintenance and repair on it multiple times a year. A decent-size data center would need an order of magnitude more solar and radiators than the ISS, and so presumably would need even more maintenance.
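The "about half the panel area in radiators" figure is in the right ballpark of a simple Stefan-Boltzmann estimate. All the numbers below (solar constant, 20% panel efficiency, a two-sided 300 K radiator with 0.9 emissivity, all electrical power ending up as heat) are my assumptions, not from any real design:

```python
# Rough radiator-to-solar area ratio for a space data center.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4
SOLAR_CONSTANT = 1361.0   # solar flux in LEO, W/m^2
PANEL_EFF = 0.20          # assumed panel efficiency
EMISSIVITY = 0.9          # assumed radiator emissivity
T_RAD = 300.0             # assumed radiator temperature, K
SIDES = 2                 # radiator emits from both faces

electric_w_per_m2 = SOLAR_CONSTANT * PANEL_EFF            # ~272 W per m^2 of panel
reject_w_per_m2 = SIDES * EMISSIVITY * SIGMA * T_RAD**4   # ~827 W per m^2 of radiator
ratio = electric_w_per_m2 / reject_w_per_m2               # radiator area per panel area

print(f"~{ratio:.2f} m^2 of radiator per m^2 of panel")
```

A hotter radiator shrinks that ratio fast (T^4), which is one reason real designs care so much about radiator operating temperature.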


You forgot 5: SpaceX has a monopoly on deploying satellites to LEO, with practically unlimited room for growth, and far less red tape and obstacles than anywhere on Earth. Whatever R&D and operational costs this insane engineering feat might have are offset by their market advantage, and Musk's Elizabeth Holmes-ian capability to fund his projects, in addition to relying on his own personal wealth and all of his other companies combined.

The fact that this lunatic is polluting humanity's view into the universe mainly for enriching himself and his shareholders, and that everyone is playing along with this, is sickening.


Every one of those points is false or an outright lie, though.


> What is the benefit of "moving compute to space"?

I’ll bite. It’s cheaper and quicker to permit a launch than permit, zone and interconnect a datacenter. And solar panels in space don’t need glass cladding, which makes them cheaper to make and lift.

The downside is launch cost. But there is a breakeven between these factors that seems to have most of its error bars within Starship’s target. (By my math, around $35/kg.) So if Starship works, and all indications seem to show that it will, eventually, then that puts space-based data centers at cost parity with terrestrial ones within a decade. Which was, well, unexpected when I ran the numbers.

(The surprising finding when you run the numbers is that launching the chips and solar panels isn’t the limiter, it’s launching the radiators. Which opens up whole new questions about at what scale it makes sense to stop sending those up the well.)


> It’s cheaper and quicker to permit a launch than permit, zone and interconnect a datacenter

There's plenty of empty land sufficiently far from cities and not being used for anything else and that shouldn't have permitting or zoning problems.

For interconnect do that via satellite.


> plenty of empty land sufficiently far from cities

Which means interconnect permitting.

> For interconnect do that via satellite

As in power.


Ah, I was unclear. I meant build in empty land far from cities where you also have room to put in enough solar panels and batteries to power the data center.


> where you also have room to put in enough solar panels and batteries to power the data center

Environmental reviews. (The further from civilization the higher the chances the Southern farting nuknuk or whatever nests in your nowhere.) And construction costs.


The capacity of a single datacenter would require thousands of launches to get the equipment into space. I don’t believe for a second that this would be easier in any way. Cooling and bandwidth are also completely unsolved for compute on a useful scale.


> capacity of a single datacenter would require thousands of launches to get the equipment into space

But that equipment starts generating compute as soon as it’s up. This dramatically increases the capital efficiency of the venture. (Though space launch is still ultimately capital-intense. The lower rates go, the more attractive it becomes.)

> Cooling and bandwidth are also completely unsolved

Quite wrong. (Though I was surprised by this, too.) ISS-style radiators (14 kg/kW) require Starship’s most optimistic launch cadences to be economic. But sub-10 kg/kW, which is closer to ISS heritage than any of the newer stuff, lets $100/kg to LEO work under most circumstances. Drop it to 6 kg/kW and even Falcon 9 becomes viable for low costs of capital (<3%) and 4-year permitting and build times.

Bandwidth is a problem, but an engineering one. (And one Starlink is working on with laser backhaul.)
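To make the radiator economics concrete: launch cost per kW of heat rejection is just specific mass times launch price. The kg/kW and $/kg figures below are the ones from this thread, not quoted prices:

```python
# Launch cost of radiator mass needed to reject 1 kW of heat.
def radiator_launch_cost_per_kw(kg_per_kw: float, usd_per_kg: float) -> float:
    return kg_per_kw * usd_per_kg

# ISS-style (14), "closer to ISS heritage" (10), and optimistic (6) radiators,
# at an assumed $100/kg and a Starship-target-ish $35/kg to LEO.
for kg_per_kw in (14, 10, 6):
    for usd_per_kg in (100, 35):
        cost = radiator_launch_cost_per_kw(kg_per_kw, usd_per_kg)
        print(f"{kg_per_kw:>2} kg/kW at ${usd_per_kg}/kg -> ${cost:.0f}/kW")
```

At $100/kg, going from 14 kg/kW to 6 kg/kW cuts the radiator launch bill from $1400/kW to $600/kW of heat rejected, which is why the specific mass dominates the argument above.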


What about maintenance? I’d naively assume that’s the killer.


> What about maintenance?

Simply put, you don’t. Your DC is launched into its graveyard. If a chip burns out it burns out—maybe rack design is a bit more redundant to keep failures as independent as possible.

Maybe at some point repair is a valid optimization. But it’s not necessary for an MVP, namely, one that is competitive against 3 to 5-year terrestrial delays and sub-10% costs of capital for such projects. That’s what has surprised me.


It seems like that could change the math quite a bit, since you’d presumably be losing a lot of capacity to failures. I’d assume you would have a much higher failure rate in space, and component failure is already pretty common on earth.


That xAI fails faster, hopefully.



Not having to defend in court why polluting the area where you built your datacenter, and fucking it up for the residents there, is actually better for all mankind.


Whale meat is so iron-rich. It's basically like eating liver.

Now reindeer steak — that’s excellent!


iOS has been so bad at it; you select text to copy and then find out the last one or two characters are missing :/


I wonder what proportion of traffic is AI model weights being served. They are huge files, lots of mostly duplicated + fine tuned versions, lots of interest to download the latest and greatest, lots of ML aficionados grabbing them.


I used Joplin for a few years and loved it. I quit Evernote as soon as it lost all of my notes years ago.

Switched to Obsidian for faster startup time, which is at the top of my feature list for such apps. Joplin got worse over time with more notes. I considered Roam and Notion, but having to pay AND a slow startup made no sense, although Notion's features are quite nice.

Now thinking about adding Logseq to work with my Obsidian.

I also think DEVONthink is a great closed-source app, especially for research, where you can index and search your entire PDF collection and it will give you the closest matching files and content with respect to your current file. Many other great features. But it’s a Mac/iOS-only app and lacks Linux/Windows support. Startup time is very slow. And the UX for note-taking is kind of unpleasant to work with. I really want to use it but I can’t, for all of these reasons. It’s like an expensive car which you own but never want to drive.


How would you use Logseq together with Obsidian?


I like to drop anything interesting I find throughout my day into a daily journal in Obsidian (using iOS “Share via…”) for later review.

It works fine, but:

1. I prefer how Logseq displays each day as a timeline so I can review the last few days easily (in Obsidian you check each day's file one by one)

2. I like that Logseq operates at a more granular, block level (bullet points) as opposed to pages, so I can reference blocks instead of pages.

I think interlinking thoughts and notes at the block/bullet level would be helpful in finding content or thoughts I came across in the past. In Obsidian, it’s only possible via searching, manual tagging, and manual content management, which seems like a waste of time. I want to eliminate the friction for inbound information.

Both of these apps actually suffer from the same issue — on iOS, if the app hasn’t been initialized recently, it won’t actually drop the content shared via the “Share via…” widget; it will just open the note for today. Sometimes you have to do it twice.

In terms of configuration, the way it works is: in Obsidian, you configure the daily journal to use the same directory and naming convention as Logseq. They both read/write the same markdown files, so it works seamlessly.


I feel like this DIY project needs its own DIY to build a cheaper 32” e-ink screen.

Sarcasm aside, thanks for sharing! The largest display I’ve seen has been 11” until now.


Nice. I think you have potential to do great. You have a nice voice which is easy to listen to and you’re going at the right pace. Music volume is just right. Keep at it.


Thank you for the feedback,

I really appreciate it

