Automic Workload Automation

  • 1.  Protecting the REST API

    Posted Jan 24, 2022 10:13 AM
    There is some vague chance that there is one person in this community who is as paranoid as I am. With that poor single individual in mind, I thought I would post a small workaround I did to help me sleep a little bit better at night ;)

    The "problem"
    I wanted to expose the Automic REST API to a wider, unrestricted network range (part of the company network, or maybe even the Internet :goosebumps:).

    The problem underneath the problem
    Automic does not let me filter in any way which user is allowed to do what, let alone create dedicated disposable tokens with a defined usage scope. So once I let the network traffic in from external sources (not directly the AWI), everyone can join the party to the extent their Automic permissions let them.

    I do not like it. 

    The workaround
    As we are talking about HTTP(S) requests, I went with the path of least resistance and used nginx as a proxy exposing my REST API backend. This gives me at least some minimal degree of control over who is allowed to do what.

    Here is a sample configuration giving me a few benefits:
    • access only for users whose name starts with API,
    • access limited on a per-client basis,
    • control over which HTTP methods can be used.

    server {
        listen      80;

        # restrict methods to valid ones
        if ($request_method !~ ^(GET|POST|PUT|DELETE)$) {
            return 405;
        }

        # restrict users: only basic-auth user names starting with API
        if ($remote_user !~ ^API) {
            return 403;
        }

        set $baseurl "https://your_ae_hostname:8088";
        resolver 1.1.1.1;

        # Proxy for ping
        location /ae/api/v1/ping {
            proxy_pass $baseurl/ae/api/v1/ping;
            # Proxy timeouts
            proxy_connect_timeout 60s;
            proxy_send_timeout    60s;
            proxy_read_timeout    60s;
        }

        # Allow only selected clients
        location ~ /ae/api/v1/(0|100|200)/ {
            proxy_pass $baseurl;
            # Proxy timeouts
            proxy_connect_timeout 60s;
            proxy_send_timeout    60s;
            proxy_read_timeout    60s;
        }
    }


    Based on that approach, some fine-grained control is possible, restricting certain users to certain methods, endpoints, etc. The proxy runs plain HTTP because it is just a container that gets HTTPS termination a little bit earlier in the chain, but it could of course serve HTTPS right away.
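    As an illustration, a location block along these lines (the user name and client number are just made-up placeholders) could pin a single read-only integration down to GET requests on one endpoint:

    # Sketch only: a hypothetical read-only user, limited to GET on one endpoint of client 100
    location /ae/api/v1/100/executions {
        if ($remote_user !~ ^API_REPORTING) {
            return 403;
        }
        if ($request_method != GET) {
            return 405;
        }
        proxy_pass $baseurl/ae/api/v1/100/executions;
    }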

    It may turn out to be a rhetorical question, but did any of you struggle with a similar problem?

    How do you control who can do what using REST? :) 

    ------------------------------
    Cheers,
    Marcin
    ------------------------------


  • 2.  RE: Protecting the REST API

    Posted Jan 24, 2022 11:21 AM
    Edited by Pete Wirfs Jan 24, 2022 11:30 AM

    We are exploring exposing the REST API to the internet so that cloud applications can kick off Automic requests.  One idea our team is experimenting with is a firewall opening that only accepts connection requests coming from a specific IP address; anything else is blocked at the firewall.  I will be sharing your ideas with the team as well.  The more restrictions the better.
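    (For what it's worth, the same allowlist idea could also be mirrored on an nginx proxy as a second line of defence; the address below is only a placeholder.)

    # Sketch only: accept requests from one known caller, reject everything else
    location /ae/api/v1/ {
        allow 203.0.113.10;   # placeholder IP of the cloud application
        deny  all;
        proxy_pass https://your_ae_hostname:8088;
    }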

    We are also insisting that every app must use a unique service account, and that service account will only have just enough rights inside the client to perform its assigned tasks.



    ------------------------------
    Pete Wirfs
    SAIF Corporation
    Salem Oregon USA
    ------------------------------



  • 3.  RE: Protecting the REST API

    Posted Jan 24, 2022 02:37 PM
    What I particularly like in the proxy approach is that it lets me create an unlimited pool of dedicated URLs for every app / use case. So this is as close to a token approach as I could think of.

    So, for example, should I need to onboard some external app, it will get a unique generated URL with a tailored config based on its needs. Should it get compromised or decommissioned, you can just invalidate the URL :)

    This also makes service discovery quite hard, because you would need to know a combination of random URL, valid user and allowed endpoints to get any backend response.
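    A rough sketch of what one of those per-app locations could look like (the random prefix and client number are made up, and $baseurl is the variable from the config above):

    # Sketch only: a randomly generated prefix acts as a lightweight token for one onboarded app;
    # decommissioning the app just means deleting this block
    location /f3a9c1e7b2d4/ae/api/v1/100/ {
        # strip the random prefix before forwarding to the AE REST API
        rewrite ^/f3a9c1e7b2d4(/.*)$ $1 break;
        proxy_pass $baseurl;
    }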


     


    ------------------------------
    Cheers,
    Marcin
    ------------------------------



  • 4.  RE: Protecting the REST API

    Posted Jan 25, 2022 02:12 AM
    You basically need a WAF. nginx as a reverse proxy can partially meet these requirements. I totally like the way you implemented it (at least once you have it set to HTTPS :-)). It's easy, straightforward, and you can also limit it to endpoints & methods.
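    A minimal sketch of serving HTTPS directly on the proxy, assuming a certificate is at hand (hostname and paths are placeholders):

    # Sketch only: TLS termination on the proxy itself
    server {
        listen              443 ssl;
        server_name         automic-api.example.com;
        ssl_certificate     /etc/nginx/certs/automic-api.crt;
        ssl_certificate_key /etc/nginx/certs/automic-api.key;
        ssl_protocols       TLSv1.2 TLSv1.3;

        location /ae/api/v1/ {
            proxy_pass https://your_ae_hostname:8088;
        }
    }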

    ------------------------------
    ☎️ Swisscom Automation Engineer & 🧙 PE Membership Creator

    Automic courses, tutorials, tools and more at:
    https://membership.philippelmer.com/
    Try two weeks free of charge!
    ------------------------------