
curl for https support

Discussion in 'Tomato Firmware' started by fiwiman, Mar 4, 2018.

  1. fiwiman

    fiwiman Network Newbie Member

    Last edited: Mar 4, 2018
  2. koitsu

    koitsu Network Guru Member

    I don't think curl is truly *needed* for this as a program/library. The problem stems from a few things:

    1. The website continually gives HTTP URLs throughout their site, example code (for PHP), and so on -- yet if you actually try to use one, you'll find that they send back a Location: header redirecting you to the same URL but HTTPS. For an API, this is Generally Bad Practice(tm) because it makes a lot of bad assumptions about the underlying client's capabilities, especially because HTTP -> HTTPS redirects have a lot of caveats/problems (many underlying HTTP libraries won't follow redirects of this nature). They should just update their site to use HTTPS URLs exclusively for their API and turn off/drop HTTP for the API. (There's a quick way to see the redirect for yourself after point 3 below.)

    This marks the 2nd or 3rd time this particular website (macvendors.com) has "tweaked" something about their stuff that has caused breakage on TomatoUSB. It's really starting to tick me off. I'm also growing pretty tired of this HTTPS-Everywhere attitude.

    2. I added TLS SNI support for Busybox wget a few years ago and it was integrated. Here is my forum post which includes details of the whole event, as it required a lot of reverse-engineering. This got things working behind CloudFlare and a couple other places.

    2a. How Busybox (thus TomatoUSB) "does" SSL is not like how everything else does it. SSL capability on Busybox (thus TomatoUSB) usually involves a fork/exec of a separate process, notably running openssl s_client with some unique flags/arguments; there's a rough illustration of this after point 3 below. It's incredibly fragile. Everything else in the world uses the actual OpenSSL API through libssl in native C (which is literally the worst API I have ever used/worked with -- I'm not overdramatising this either, it really is awful, and you can read about the awfulness as documented by the LibreSSL folks).

    2b. The OpenSSL version included with TomatoUSB tends to be "older" (it's version 1.0.2h, at least in Toastman), and is built with a large number of features/ciphers/etc. disabled due to firmware space limitations.

    3. In the past 2 years there has been a TREMENDOUS amount of focus on SSL in general, particularly deprecating old TLS versions and imposing a whole ton of "requirements" that, when not met by an SSL client, result in breakage. POODLE and Heartbleed (circa 2014), and later Cloudbleed (2017), brought a lot of this stuff to a head. Sometimes that means errors (if, say, using the OpenSSL libssl library natively in C), sometimes it means silent breakage -- and even if you get an error, it won't necessarily make any sense (example here).
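
    To see point 1 for yourself, here's a quick check you can run from any Linux box with GNU wget (hedged: it assumes the site still redirects HTTP to HTTPS as described above, and the MAC address in the URL is just an example):

    # Request the plain-HTTP URL but refuse to follow redirects; -S prints the response
    # headers, so the 301 and its Location: header (pointing at the HTTPS URL) are visible.
    wget -S --max-redirect=0 -O /dev/null http://api.macvendors.com/fc:fb:fb:01:fa:21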
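
    And to illustrate point 2a, a shell-level HTTPS fetch via openssl s_client looks roughly like this (a sketch only -- it is not the exact command Busybox wget spawns, and the host/path are just examples):

    # Pipe a raw HTTP request into s_client; -servername sends TLS SNI, -quiet suppresses
    # the certificate chatter. Every piece of this (the pipe, the flags, parsing whatever
    # comes back) is a place where things can quietly fall apart.
    printf 'GET /fc:fb:fb:01:fa:21 HTTP/1.1\r\nHost: api.macvendors.com\r\nConnection: close\r\n\r\n' | \
      openssl s_client -quiet -connect api.macvendors.com:443 -servername api.macvendors.com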

    You can't just install Entware-ng (or Optware) and have this "magically work" in TomatoUSB -- at least for GUI portions -- without some extra work (specifically a bind mount to "replace" /usr/bin/wget with /opt/bin/wget; but this has major risks, because GNU wget does not behave exactly like Busybox wget, so other parts of TomatoUSB could silently break or act weird for you). From the CLI / interactively, yes, it would work just fine, but IIRC there are parts of the TomatoUSB GUI (mainly under Device List, I think) that use this macvendors API directly.
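
    For completeness, the bind mount I mean would look something like this (shown only so you know what I'm talking about -- I don't recommend it, and it assumes Entware-ng's usual /opt layout):

    # Overlay the Entware GNU wget on top of the firmware's Busybox wget (risky, see above).
    mount --bind /opt/bin/wget /usr/bin/wget
    # To undo it:
    umount /usr/bin/wget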

    I will see if I can figure out "what" is breaking with this website and "how", and try to propose a fix. It's almost certainly related to #3, or possibly related to URL encoding. It takes time to debug and reverse-engineer every little bit. The point of this long diatribe is to demonstrate that on TomatoUSB and embedded devices in general (esp. where firmware size is a focus), it isn't as easy as "do this simple thing and it's fixed"; it usually involves going down a deep rabbit hole.
    fiwiman likes this.
  3. fiwiman

    fiwiman Network Newbie Member

    Thanks for the explanation.

    As a temporary workaround, I've written a PHP page on a local web server that uses their new API, and altered my script on Tomato to hit that instead.
  4. koitsu

    koitsu Network Guru Member

    Okay, fetching data from this site's API via HTTPS works fine with Busybox wget.

    The issue I was running into, which made me think it wasn't working, is a weird bug with Busybox wget. Saving to a local file works fine:

    root@gw:/tmp/home/root# /usr/bin/wget https://api.macvendors.com/fc:fb:fb:01:fa:21
    Connecting to api.macvendors.com (
    fc:fb:fb:01:fa:21    100% |***********************************************************************************|    18   0:00:00 ETA
    root@gw:/tmp/home/root# cat fc\:fb\:fb\:01\:fa\:21
    Cisco Systems, Incroot@gw:/tmp/home/root#
    Note however that there's a lack of newline in the output (after "Inc") -- that's the API doing that, not wget or something else.

    But when I tried to output to stdout instead of a file, I got nothing:

    root@gw:/tmp/home/root# /usr/bin/wget -O- https://api.macvendors.com/fc:fb:fb:01:fa:21
    Connecting to api.macvendors.com (
    -                    100% |***********************************************************************************|    18   0:00:00 ETA
    Checking with strace showed me that yes, wget was in fact outputting the result from the website to stdout (writing to fd 1 of the main process):

    root@gw:/tmp/home/root# strace -s 4096 -tt -f -- /usr/bin/wget -O - https://api.macvendors.com/fc:fb:fb:01:fa:21
    [pid 22349] 20:29:21.085441 read(0, "GET /fc:fb:fb:01:fa:21 HTTP/1.1\r\nHost: api.macvendors.com\r\nUser-Agent: Wget\r\nConnection: close\r\n\r\n", 8192) = 98
    [pid 22349] 20:29:21.169871 write(1, "HTTP/1.1 200 OK\r\nServer: nginx\r\nDate: Sun, 04 Mar 2018 20:29:45 GMT\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 18\r\nConnection: close\r\ncache-control: max-age=0, private, must-revalidate\r\nx-request-id: 6t3g7r5qdilj9835ignjevjpk3mufj84\r\n\r\nCisco Systems, Inc", 269 <unfinished ...>
    [pid 22348] 20:29:21.171720 write(1, "Cisco Systems, Inc", 18Cisco Systems, Inc) = 18
    It turns out that the progress bar Busybox spews to stderr actually seems to cause some output issues (or this could be a side effect of the lack of newline in the content coming back from the server -- again, that's not wget's fault). If I redirect stderr to /dev/null, things work:

    root@gw:/tmp/home/root# /usr/bin/wget -O- https://api.macvendors.com/fc:fb:fb:01:fa:21 2>/dev/null
    Cisco Systems, Incroot@gw:/tmp/home/root#
    I could have used the -q flag as well, but I went for what I knew worked.
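
    For reference, the -q variant would just be this (same thing, only quieter; the output should be identical to the 2>/dev/null run above):

    /usr/bin/wget -q -O- https://api.macvendors.com/fc:fb:fb:01:fa:21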

    This same bug happens with GNU wget, believe it or not -- but look very very very closely at the output and you'll see a "C" near the end of the 0% progress bar (that's the "C" from "Cisco"):

    root@gw:/tmp/home/root# /opt/bin/wget -O- https://api.macvendors.com/fc:fb:fb:01:fa:21
    --2018-03-04 20:32:11--  https://api.macvendors.com/fc:fb:fb:01:fa:21
    Resolving api.macvendors.com..., 2600:3c03:1::45a4:dfb5
    Connecting to api.macvendors.com||:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 18 [text/plain]
    Saving to: 'STDOUT'
    -                                  0%[                                                           ]       0  --.-KB/s               C-                                100%[==========================================================>]      18  --.-KB/s    in 0s
    2018-03-04 20:32:11 (714 KB/s) - written to stdout [18/18]
    In copy-pasting the above, it appears that the "C" is actually the last character (farthest to the right on my terminal) and there's no newline following it -- meaning all of that is "one big long line" on a terminal. That makes me think this problem is caused by the lack of newline in the response content from the server, and it just manifests badly/rudely with wget.

    As with Busybox wget, either redirecting stderr (where the progress bar/status indicators go) to /dev/null or using the -q / --quiet flag makes GNU wget work again:

    root@gw:/tmp/home/root# /opt/bin/wget -q -O- https://api.macvendors.com/fc:fb:fb:01:fa:21
    Cisco Systems, Incroot@gw:/tmp/home/root#
    In short: you can successfully fetch the API response from https://api.macvendors.com with Busybox wget. You do not need curl. You just need to be aware that there seems to be an uncomfortable quirk/bug in wget (both Busybox and GNU) relating to its progress/status bar, so suppress it with either the -q flag or 2>/dev/null and you should get the results you want.
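
    If you want to script it, a tiny helper along these lines should do (a sketch, not something in the firmware; the function name is mine, and the trailing echo just adds the newline the API omits):

    # Usage: mac_vendor fc:fb:fb:01:fa:21
    mac_vendor() {
        wget -q -O- "https://api.macvendors.com/$1" 2>/dev/null
        echo    # the API response has no trailing newline
    }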
    fiwiman likes this.
  5. fiwiman

    fiwiman Network Newbie Member

    Thanks, but I'm getting:

    wget: error getting response: Connection reset by peer

    while using /usr/bin/wget on my router.
  6. koitsu

    koitsu Network Guru Member

    Sorry, I cannot reproduce that problem here. I run tomato-RT-AC56U-9008.8Toastman-ARM-VPN-64K.trx (Toastman-ARM) on an RT-AC56U.

    Shibby 140 includes support for TLS SNI in wget -- it was introduced in Shibby 138.

    The error message in question means the remote end sent back a TCP RST, i.e. the connection was torn down abruptly or early on the server side. I think I know the answer: I have IPv6 disabled in TomatoUSB because IPv6 is a travesty and a nightmare.

    api.macvendors.com DNS entries are:

    api.macvendors.com.     300     IN      A
    api.macvendors.com.     300     IN      AAAA    2600:3c03:1::45a4:dfb5
    You can see they advertise an IPv4 address and an IPv6 address. Well guess what: their IPv4 works, their IPv6 doesn't (their webserver isn't listening on IPv6).
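
    If you want to confirm that yourself from a dual-stack machine, GNU wget can be pinned to one address family at a time (hedged: -4/-6 are GNU wget options, not Busybox ones, and this assumes their DNS/listener situation hasn't changed since I looked):

    wget -4 -q -O- https://api.macvendors.com/fc:fb:fb:01:fa:21; echo   # IPv4: prints the vendor
    wget -6 -q -O- https://api.macvendors.com/fc:fb:fb:01:fa:21; echo   # IPv6: fails / resets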


    Disable IPv6 and the problem should go away for you. Otherwise, you can simply blame the people who run that website for being negligent. :)
    fiwiman likes this.
  7. ruggerof

    ruggerof Network Guru Member


    curl -s -k "https://api.macvendors.com/10:BF:48:89:F9:B8"
  8. eibgrad

    eibgrad Network Guru Member

    I don't recommend using curl for https w/o the -L option.
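
    For example (untested just now), -L makes curl follow redirects, including the HTTP -> HTTPS redirect mentioned earlier in the thread, so even the plain-HTTP form of the URL works:

    curl -sL "http://api.macvendors.com/10:BF:48:89:F9:B8"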
  9. fiwiman

    fiwiman Network Newbie Member

    I appreciate your help. wget works as per your instructions; it turns out I was on 137 and got mixed up with another router.
    koitsu likes this.
