Tesla.com/.gitignore - Hacker News

jbverschoor

So basically you run an endless script to fetch https://www.tesla.com/sites/default/settings.php and hope that some day there will be a minor nginx config error which lets you download the php source instead of executing it.

This will happen some day, so invest 5 bucks per month and at some point you'll get to exploit Tesla. Maybe you can even be first in line for the Cybertruck :-)
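
That endless script is about a dozen lines. A minimal Python sketch of the idea; the URL comes from the comment above, and everything else (the polling interval, the "<?php" detection heuristic) is an assumption:

    # Hypothetical poller: fetch the URL forever and alert if the server ever
    # returns raw PHP source instead of executing it.
    import time
    import requests  # third-party: pip install requests

    URL = "https://www.tesla.com/sites/default/settings.php"

    while True:
        try:
            resp = requests.get(URL, timeout=10)
            # Executed PHP never echoes its own opening tag; leaked source would.
            if "<?php" in resp.text:
                print("nginx config error of the day has arrived")
                break
        except requests.RequestException:
            pass  # network hiccup; keep waiting for the lucky day
        time.sleep(3600)  # once an hour is plenty for a multi-year bet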

rvnx

This seems like too sophisticated an attack; sometimes simplicity is better: https://samcurry.net/cracking-my-windshield-and-earning-1000...

walrus01

Time to try naming your Tesla "drop table vehicles;"

gdhdjdvr

Ah, good old Bobby Tables :)
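
For anyone who hasn't met Bobby Tables (xkcd 327): the trick only works against string-concatenated SQL. A minimal sqlite3 sketch; the table and the crafted name are hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE vehicles (name TEXT)")

    name = 'CyberTruck"); DROP TABLE vehicles;--'

    # Vulnerable pattern: interpolating the name lets it escape the statement.
    # conn.executescript(f'INSERT INTO vehicles (name) VALUES ("{name}")')

    # Safe pattern: a bound parameter is always treated as data, never as SQL.
    conn.execute("INSERT INTO vehicles (name) VALUES (?)", (name,))
    print(conn.execute("SELECT name FROM vehicles").fetchone()[0])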

j-bos

Great read

grubby

this was such a great read, people like you make me want to learn more and more every day

grumple

Pretty sure every site on IPv4 gets probed multiple times a day for common config leaks and other misconfigurations. Happens to all of mine.

jbverschoor

Yeah, but if a gitignore tells you where to look, and it isn't even blocked by a WAF or a rule, it makes an interesting target, esp. at one of the largest companies out there.

You shouldn't even be able to execute settings.php

TechBro8615

It's a good sign there might be an exploitable file upload vulnerability if you can find an endpoint that uploads files into a directory that's served by Apache with the same configuration as the directory containing the executable settings.php.

c7DJTLrn

Finally, a compelling reason to use IPv6.

TechBro8615

This comment transported me back to 2010 or thereabouts when this happened to Facebook. I remember being surprised at the simplicity of the code and making a lot of jokes about "build a facebook clone" ads on freelance websites.

rbanffy

I am sure there are lots of automated scripts doing precisely that with pretty much every company that has a website.

I used to keep a hall of shame on my main site, because looking for "settings.php" or "global.asa" on a Zope site was just silly.

retrocryptid

Except that you'll find that error long before the Cybertruck ships. Heck, you'll probably see the rebirth of NFTs and BTC back above US$40,000 before the Cybertruck ships.

tomjakubowski

Interesting, the exclude file (actually, everything under .git/info) 403s, while .git/index is a 404.

- https://www.tesla.com/.git/info/exclude

- https://www.tesla.com/.git/index

README.txt 403s too. https://www.tesla.com/README.txt

edit: just going to add files I've found here:

- https://www.tesla.com/.editorconfig

- https://www.tesla.com/profiles/README.txt

bumblewax

Two space tabs, nice.

jahsome

Add a trailing slash to index and it 403s

retrocryptid

sigh

retrocryptid

really? five down-votes because I sighed?

If you're going to down-vote me, down-vote me because I mentioned Elon is a human being, with human flaws and human strengths and not the resurrection of Supply-Side-Jesus.

ericmcer

A company's marketing website and their actual products have little in common. I would be surprised if any engineers even work on the marketing website, and blown away if it were co-located with anything sensitive.

FormerBandmate

https://xkcd.com/932/

Ffs, a tech forum should be better than this

wink

No, it's a valid complaint. I've seen it at several companies: the development team was eager to present a professional website (so that anyone in the know who looked wouldn't find embarrassing stuff that might scare off potential new hires or customers), but it ended up in the hands of the marketing department. To the degree that the infra was moved to a different domain, so the WordPress install at "www.example.com" could never even remotely do anything with cookies at "example.net" - but yes, that might have been a tad paranoid ;)

I think the person you were replying to was not playing down what happened, but explaining exactly what the cartoon says. It not being important to the general public doesn't mean it isn't a problem.

FormerBandmate

I was agreeing with the person I replied to. Most of the rest of the thread is implying this affects self-driving somehow

ranman

I would judge a vendor or consulting firm based on their marketing website. Why wouldn't I judge a car maker?

anonym29

If you think .gitignore leaks too much info, you're going to love https://www.tesla.com/robots.txt

soneil

The start/stop at the bottom makes it look like it came canned with a CMS and they just tacked on what they needed. It's 90% boilerplate.

chx

It's hardly a secret that tesla.com is Drupal -- both that gitignore and the robots.txt shout it quite loudly, to be fair. One of the larger Drupal agencies, Lullabot, includes them in their client list: https://www.lullabot.com/our-work and they are looking for a sr backend Drupal engineer https://www.tesla.com/careers/search/job/sr-software-enginee... which I would take if the company were not led by Musk.

capableweb

Not to mention that a lot of the subsequent requests when loading https://www.tesla.com/ contain the HTTP header+value "x-generator: Drupal 9 (https://www.drupal.org)"

So yeah, not exactly a secret.

Neil44

And the bumph at the top - crawlers run by Yahoo! and Google - lol

judge2020

It’s the default Drupal robots.txt, it seems. https://api.drupal.org/api/drupal/robots.txt/5.x

jongjong

If that's all the dirt that thousands of vengeful fired Twitter ex-employees could find, then Tesla must have excellent security.

bakugo

Yeah this screams complete and utter desperation. Like, I get that hating Elon is what all the cool kids at school are doing this month but do we really need this immature garbage on the front page of HN all day?

extheat

Yep, it seems like most of the posters here in this thread don’t do much software engineering, from the looks of it. Or are being purposely obtuse. There is no security vulnerability in any of the links we’ve seen so far, minus some unnecessarily deployed boilerplate. The gitignore file is not the same file your deployment tool uses when publishing a website. If there were an API endpoint that is public, as opposed to some static asset, that would be a problem. Nothing we’ve seen here indicates that.

Hamuko

Well, I'd personally at least find some hilarity in being a Twitter engineer fired by one of those 10x Tesla engineers while they're publishing their .gitignore files via HTTPS (which probably means that their Nginx configuration is fucked).

prepend

This is not an issue; it just means that their wwwroot probably comes from a repo. Anyone who thinks less of an engineer for making this decision is being silly.

I’d say it’s closer to a good thing than a bad thing, due to the simplicity.

jongjong

It's barely a vulnerability. Many open source projects have theirs public. It might be a problem if the company's system was terrible and relied on security through obscurity; but maybe they don't care. The engineers who think it's a big deal may have tunnel vision. That can happen if you spend years in a very narrow area.

randomsearch

https://xkcd.com/932/

I look forward to meeting the Tesla engineers who work on their core tech and also their webpage.

jasonvorhe

People are just having some fun.

threatripper

This looks like a default file from a Drupal installation: https://api.drupal.org/api/drupal/robots.txt/7.x

m00x

Really doesn't leak much, and robots.txt is supposed to be accessible from the internet.

anonym29

Yes, it's meant to be public, but you need not disclose all of what is contained inside of it. I've been on many pentests where paths provided by robots.txt, that I wouldn't have obtained any other way, led to exploitable vulnerabilities.

For some reason, a considerable number of people don't seem to think twice about adding sensitive paths to robots.
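
That recon step is easy to picture. A hedged sketch using only the standard library; the base URL is a placeholder, and a real pentest would only do this with authorization:

    # Pull Disallow: paths out of robots.txt and probe them; 200 vs 403 vs 404
    # is itself a signal about what exists behind each path.
    import urllib.request

    BASE = "https://example.com"  # placeholder target

    with urllib.request.urlopen(BASE + "/robots.txt") as f:
        lines = f.read().decode(errors="replace").splitlines()

    paths = [l.split(":", 1)[1].strip() for l in lines
             if l.lower().startswith("disallow:")]

    for path in paths:
        if not path or "*" in path:
            continue  # skip empty rules and wildcards in this sketch
        req = urllib.request.Request(BASE + path, method="HEAD")
        try:
            status = urllib.request.urlopen(req).status
        except urllib.error.HTTPError as e:
            status = e.code
        print(status, path)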

hsbauauvhabzb

robots.txt is a web standard; if it lists routes to actual sensitive data, then hosting that sensitive data at those paths is the issue, not robots.txt.

I regularly see bad pentesters fall for this.

slim

that's defense in depth, right? /s

also, sometimes what's in robots.txt becomes invisible to the corporation as well, and obviously bugs creep in

cuteboy19

I would rather the paths be secure themselves. Security by obscurity is not a good idea. Anyway, there are not that many combinations of paths, even when you consider all the different CMS defaults.

teknopaul

Not the case here tho is it

marginalia_nu

Did an inventory based on my crawler data a while back.

Relatively common to find sensitive or embarrassing links singled out in robots.txt

Especially in old large organizations, like universities.

slaymaker1907

Apparently Tesla is FOSS, see https://www.Tesla.com.

Ptchd

Where can I get the FSD (Fake Self Driving) source code?

anonym29

edited to hide my horrific lack of HN text formatting skills

ChrisClark

What makes it fake? Just today my car drove me from my house to the grocery store with no intervention.

tacker2000

It's just random CMS bs. Nothing to hate Elon about

reaperducer

> If you think .gitignore leaks too much info, you're going to love https://www.tesla.com/robots.txt

I wonder if these are some of the same people that Musk brought in to refactor Twitter.

madmod

I found a bug in the Tesla Model 3 reservation system that allowed anyone to get a reservation for free. Reported it via HackerOne (or maybe it was Bugcrowd, don't remember) and got told it was of no consequence and would be filtered out later or something. Got no bounty for hours of work.

I accidentally ordered my Model 3 with a free reservation, not the one I actually paid for.

jonathanyc

Given that people are selling reservations for thousands of dollars, I think you deserved something for reporting the issue. But I suppose being a hardcore engineer means never having to say you're sorry.

revskill

So, should we just add .gitignore to .gitignore and problem solved?

kadoban

You're joking of course, but that likely won't do anything useful.

If it's tracked, then ignore has no effect. If it's not tracked, then you might as well use .git/info/exclude, which is pretty much the same thing but not tracked, or you can use a global excludes file; ~/.gitignore is common (you have to configure git to point at it, iirc).

It _could_ make sense to ignore the .gitignore if some other tool is parsing and using that file, but that pattern is... troublesome, so I hope not.

vbezhenar

~/.config/git/ignore

kadoban

Hm, did not know that had a default, thanks.
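
To make the three mechanisms in this sub-thread concrete, a small sketch; the file names and patterns are made up:

    import subprocess

    # 1. Tracked and shared with everyone who clones the repo:
    with open(".gitignore", "a") as f:
        f.write("build/\n")

    # 2. Repo-local and never committed (same pattern syntax as .gitignore):
    with open(".git/info/exclude", "a") as f:
        f.write("scratch-notes.txt\n")

    # 3. Per-user, for all repos: ~/.config/git/ignore is honored by default,
    #    while any other location needs core.excludesFile pointed at it.
    subprocess.run(["git", "config", "--global",
                    "core.excludesFile", "~/.gitignore"])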

agumonkey

the classic https://news.ycombinator.com/item?id=31420268

> Git ignores .gitignore with .gitignore in .gitignore

manojlds

.gitignore to .dockerignore

(Partly joking)

alvis

No. You never check out a site directly from git to begin with. And not letting other people know which files are ignored by git doesn't mean they cannot access them. :/

teknopaul

Nonsense.

Everyone uses git for source control; of course you check out a site with git.

All you are telling people with a .gitignore is what is _not_ available.

It means exactly that people cannot access them: if your site is a checkout, they aren't there.

NateEag

Many of us have a build process that converts the contents of a checkout into a deployable site (a.k.a. "build artifact").

The build process can trivially skip .gitignore files (and all other files that are strictly for dev environments).

You then deploy the build artifact to production, with exactly the set of files which ought to be there.
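
A minimal sketch of such a build step; the paths and the exclusion list are assumptions, not anyone's actual pipeline:

    import shutil

    # Copy a checkout into a deployable artifact, dropping .gitignore and
    # other dev-only files on the way out. "build/artifact" is what gets
    # pushed to production; the repo itself never is.
    shutil.copytree(
        "checkout/webroot",
        "build/artifact",
        ignore=shutil.ignore_patterns(".git*", ".env*", ".editorconfig"),
    )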

jahsome

Nope.

Paths in .gitignore mean git ignores them. That doesn't mean the file doesn't exist; it means it's not in source control.

An example is a .env file. It may very well be _required_ in many PHP or node projects, but it's going to be ignored.
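
The pattern in question, sketched by hand; real projects would typically use a library like python-dotenv, and the variable name is hypothetical:

    import os

    # .env is ignored by git but must exist on the machine running the app;
    # the app reads its secrets from it at startup.
    with open(".env") as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

    print(os.environ.get("DATABASE_URL"))  # hypothetical variable name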

hankchinaski

I like the simplicity and pragmatism of using Drupal. I wouldn’t work with it myself, but it was probably the cheapest/fastest way to get a site like this up and running.

throwaway6734

If you stick completely within the Drupal "standard path", it's a great way to get a site up and running. Once you step outside of that path, it's absolute misery.

jpoesen

Dunking on a tech while using a throwaway account and not providing details on why you find it absolute misery... not very useful or trustworthy.

throwaway6734

I spent around 3 years working with Drupal 7 and about half a year with Drupal 8.

For D7:

* The frontend and backend are too tightly coupled.

* The views system was awful to design custom, complex queries for. Documentation was scarce

* No dependency management

* Lots of weird hacks to do standard things, like the Features module

* The hooks system can result in a lot of complex, unclear logic

I've since moved on to the Python/JS ecosystem and it's much easier to build sites in.

rbanffy

Indeed. I'd use Plone, but it's overkill for a website like this.

behnamoh

Can someone explain why this is leaky and how it can be exploited by malicious actors?

anonym29

It's leaky because it's globally accessible and provides information that isn't otherwise readily apparent.

There is no guarantee that an exposed .gitignore (or other exposed files, like .htaccess, robots.txt, etc) will be exploitable, but they aid in the discovery process and may help adversaries uncover exploitable vulnerabilities they might have otherwise missed.

At the extreme, I've seen paths of backups of the production database listed in a publicly readable .gitignore, and that database backup was publicly accessible, too.

Most of the time nothing sensitive is revealed, but defense in depth suggests it's better not to upload files like these to your web server at all unless the server uses them (like .htaccess) or crawlers do (like robots.txt). If you do upload them, they ought not to be publicly readable (unless that's intended, as with robots.txt), and even then you'd want to make sure nothing sensitive is in any such publicly readable file. Even if there's nothing sensitive in them now, there's no guarantee nothing sensitive will ever be added.

oceanplexian

I'm gonna give my counter take. Information disclosure is something that the DevSecOps(tm) crowd spends a disproportionate amount of time on for little benefit. The number of security professionals who don't know how to code, but learned Nessus or CrowdStrike and criticize others, is too damn high.

I had to work with a security team in a FAANG for several years. They were so high and mighty with their low sev vulnerabilities, but they never improved security, and refused to acknowledge recommendations from the engineers working on systems that needed to be rearchitected due to fundamental problems with networking, security boundaries, root of trust, etc. Unsurprisingly, their "automated scanner" failed to catch something an SRE would have spotted in 5 minutes, and the place got owned in a very public and humiliating way.

When I see things like this it brings back memories of that security culture. Frankly I think Infosec is deeply broken and gawking over a wild .gitignore is a perfect example of that.

anonym29

I'm a professional red teamer at a FAANG company, for reference. There are plenty of times where I find several low severity vulnerabilities, none of which are exploitable alone, but which can be chained together to produce a functional exploit with real impact.

There's no guarantee any of your testers will find every issue, and there's no guarantee that a seemingly innocuous finding can't have a greater impact than might readily be apparent.

That said, there are a ton of charlatans in security exactly like you describe - folks who can't read code (let alone write it) who just know how to click "scan" on their GUI tools and export the report to a PDF. A lot orgs have a QA-level team running those automated scans, which get passed on to a penetration testing team, who have more experience, but a limited time window for testing, and then finally on to red teams, who, along with some appsec / product security folks who are embedded directly on product teams, tend to have the most expertise, and the most time to really dive deeply into a service or application.

Also, keep in mind that those gawking over this probably aren't security folks, and the competent security folks here may not be gawking at the file itself (or others) - just taking part in the discussion.

acdha

I work in .gov so I have a lot of experience with that kind of security “engineer”, but I’d take a more moderate position. This stuff is super-easy to resolve, so you should spend a couple of minutes closing it and then focus on more complex things. The reason: when something like log4j happens, you aren’t making it so easy for attackers to know whether you’re vulnerable; passively telling them makes it easier for them to avoid things like WAF rules that block IPs which actively probe.

Fnoord

There's no need to minimize or inflate this; we need to put it into proportion. An information leak by itself is nothing, but it must be reported and taken seriously (by default, it should be fixed).

I'm not disappointed this happens at tesla.com; I expect as much. But to many people, this is a top-notch brand. You don't expect this on google.com or nsa.gov or fbi.gov either, do you?

antod

Personally, I'd not deploy these files, although that has more to do with not wanting to discuss it yet again with auditors or pentesters than with actual security.

shudza

It's not an arbitrary thing, and any kind of vulnerability (including this one) is potentially a step in a chained exploit. I wouldn't be surprised if we see a hack before Tesla fixes this. And yes, they will fix it, because it's a security issue.

kadoban

It's a bit of an information leak, but probably not a particularly serious one. It just gives some information about what tech stack they're using, which isn't really public but also not that hard to find out, and maybe a bit about where an attacker would want to look for other sensitive stuff. Pretty minor really, on its own.

It is a bit embarrassing, because most web servers (and deployment setups) shouldn't be publishing/serving dotfiles (files with names beginning with a dot) anyway. But it's not necessarily a problem as long as they have some protection to keep the _really_ sensitive stuff from leaking; it's just kind of funny.
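
The usual fix lives in the nginx or Apache config, but the idea fits in a few lines of Python; a toy illustration, not production advice:

    # A static file server that 404s any dotfile path: the sane default
    # described above.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class NoDotfiles(SimpleHTTPRequestHandler):
        def send_head(self):
            # Refuse /.gitignore, /.git/config, /sub/.env, and so on.
            if any(part.startswith(".") for part in self.path.split("/")):
                self.send_error(404, "Not found")
                return None
            return super().send_head()

    HTTPServer(("127.0.0.1", 8000), NoDotfiles).serve_forever()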

rvnx

This shows that the teams in charge of code deployment have relatively weak quality control.

In practice, it means that if the gitignore file is leaked, there is a substantial risk that they accidentally leak the .git folder someday.

The .git folder indirectly contains downloadable copies of the website's source code, which could very well lead to leaked credentials or compromised services.

Your life can depend on Tesla.com services.

Even if you are on the pedestrian side.
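
Why an exposed .git folder means downloadable source: loose objects are plain zlib-compressed files at predictable paths. A hedged sketch; the base URL and the hash are placeholders, and you should only ever point this at a site you own:

    import urllib.request
    import zlib

    BASE = "https://example.com/.git"  # placeholder

    # /HEAD names the current branch, and refs under /refs/heads give you a
    # commit hash to start walking from:
    print(urllib.request.urlopen(BASE + "/HEAD").read().decode().strip())

    # Any object hash can then be fetched and inflated directly:
    sha = "0123456789abcdef0123456789abcdef01234567"  # hypothetical hash
    raw = urllib.request.urlopen(f"{BASE}/objects/{sha[:2]}/{sha[2:]}").read()
    print(zlib.decompress(raw)[:80])  # b"commit 233\x00tree ..." and so on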

extheat

What makes you think there is some "substantial risk"? You seem to be mixing together git repos and site deployment rules. I don't see the big deal with some CMS leftovers being deployed, but yes, from a perspective of correctness this is not something that needs to be deployed.

mlindner

> This shows that the teams in charge of website code deployment have relatively weak quality control.

FTFY. Little of Tesla's software is whatever they're using on the website. That'd be like judging Apple OS software by their website source.

rvnx

This is the customer control panel, which leads directly to the car APIs behind it, and those use the same credentials.

On the same domain there is also the Tesla SSO.

It would be bad if this got compromised, as there would be direct impact in the physical world, not just on some static landing page.

drexlspivey

So basically everyone’s life is at risk because the .gitignore got leaked. That sounds reasonable.

bpodgursky

I'd be pretty surprised if the marketing / landing site was remotely connected to the user portal. Most companies have a marketing-friendly CMS for public content, disconnected from the actual customer-facing portal.

rvnx

Tesla.com seems to be more than marketing; at the least, customers can sign in there to perform car operations.

If you can grab credentials from there, you can already do quite a few things.

See https://www.teslaapi.io/authentication/oauth (and that's the case where you don't trick an employee).

But I agree that normally they would catch it at some point.

diogenesjunior

what makes you think the tesla.com website is where they keep their real code lol?

bobthepanda

The gitignore explicitly called out where the sensitive settings file is, so presumably that makes it a lot easier to figure out where to start injecting bad code

Alupis

Sure, but these appear to be very standard directories for popular website CMS platforms like Drupal.

So, not very surprising and probably doesn't really tip anyone towards anything particularly special.

m00x

It's probably caused by an incorrect nginx configuration, which means other static files may be exposed.

Otherwise, it's not much of a leak.

diogenesjunior

you could theoretically social engineer until you find something to exploit

i.e., if the file said to ignore "/site/adminpasswords.txt", then you could go to /site/adminpasswords.txt and reveal admin passwords. this is obviously a simple ELI5 explanation but i hope it helps

however, i doubt the tesla.com website is where they keep any important code relating to actual Tesla software like we'd see used in the cars. that would be like the army keeping the real code for their software/systems at goarmy.com lol

mlindner

It's not really leaky and can't be exploited by anyone. It's an interesting curiosity at best.

djegod

I ask myself: what other files might be exposed?

alvis

There is a `cron.php` lol

MH15

behind auth as of 4pm ET though

soheil

> sites/*/settings*.php

Yes PHP is still relevant!

dpcan

Yeah. WordPress, Drupal, Joomla, Laravel, vanilla PHP. Together they power almost 45-50% of the web. So PHP is still extremely relevant. The most relevant, you might even say.

soheil

I just don't understand why Apache dropped native PHP module support, forcing everyone to deal with finicky CGI.
