Lauren was using her desk for both art and computing. She sketched out a design and I quickly drew it up in SketchUp.
Desk Concept
We decided to use an existing top from Home Depot. I probably won’t do this again; the quality was really poor. I had to gouge out lots of loose wood, fill the voids with cyanoacrylate (“CA”) glue and epoxy, and then shave off the excess adhesive with my Lie-Nielsen No. 60-1/2 Adjustable Mouth Block Plane. The end result was a nice, smooth table despite the poor wood.
Pocket Hole Calculations
I measured my Kreg pocket hole drill bit as 133 mm long and 9.5 mm wide, with a pilot tip that is 4 mm in diameter and 12 mm long (please comment below if you know the factory dimensions). Getting the Kreg dimensions right is an interesting problem; you can find some discussion and calculations at this link, where folks have talked this through.
Kreg pocket hole calculations
Since I’ll be screwing pocket holes into the legs, I selected 1″ long square-drive flat head screws for wood (black-oxide steel, number 8 screw size). I will be using glue as well. The key dimension is the 15 degree angle, and the key parameter is the distance between the 75 degree offset plane (green above) and the start of the pilot hole. In this case, that distance was zero.
To get this working in Fusion 360, I wrote this script.
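I’m not reproducing the script here; one way to approach it is to push the measured values into the model as user parameters so the pocket-hole sketch geometry can reference them. A rough sketch of that approach using the Fusion 360 Python API follows (parameter names are illustrative only, not the ones in my script):

# Rough sketch only: push measured pocket-hole values into Fusion 360 as user
# parameters. Assumes it runs from Fusion 360's Scripts and Add-Ins dialog;
# names and structure are illustrative, not my actual script.
import adsk.core, adsk.fusion

def run(context):
    app = adsk.core.Application.get()
    design = adsk.fusion.Design.cast(app.activeProduct)
    params = design.userParameters
    params.add('pocketAngle', adsk.core.ValueInput.createByString('15 deg'), 'deg', 'Kreg pocket-hole angle')
    params.add('bitDiameter', adsk.core.ValueInput.createByString('9.5 mm'), 'mm', 'Step-bit body diameter')
    params.add('pilotDiameter', adsk.core.ValueInput.createByString('4 mm'), 'mm', 'Pilot tip diameter')
    params.add('pilotLength', adsk.core.ValueInput.createByString('12 mm'), 'mm', 'Pilot tip length')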
I generated engineering drawings for the desk drawer.
drawer
And also generated drawings for all the components.
Major Desk Components
I’ve been generally unhappy with most table leg brackets, so I designed my own. This was pretty complicated, since they were an odd size that I had to cut out of a solid block of ash.
Brackets
I made them to fit the dimensions of the table exactly and use a lag screw to anchor the legs.
bracket dimensions
To build these brackets, I laser-cut a jig to ensure I drilled the holes in the right places. This worked out really well with my crosscut sled and drill press.
Jig Design
Jointing · Test fit · Drilling holes · Custom push stick · Cutting down to size · Cutting out the inlay · Finished brackets
The final desk came together pretty much exactly according to plan.
I wanted to measure the alignment of my table saw blade and fence. You can buy tools that do this, riding in the miter-slot track with a dial gauge included, for about $70. For example, this one:
Example tooling
That tool has fairly limited travel, and I already had a dial gauge, so I designed my own. I did the design over my morning coffee before work and cut it while finishing my email, so the whole thing took about 30 minutes of total time. Lockheed provides access to the laser cutter and the material for hobby use:
Component
I made the part from 1/4 inch thick acrylic and cut it on a laser cutter (Glowforge). I tapped two M6 holes (5 mm hole diameter) in the back so I could get a secure bolt.
Tool in Action
To cut this, I arranged the parts and post-processed the toolpaths with a 0.1778 mm kerf:
The Pocket-Hole Jig 720 uses Kreg Automaxx™ to enable one-motion clamping by simultaneously clamping your workpiece and automatically adjusting to the exact thickness of your material (from 1/2″ to 1 1/2″), while GripMaxx™ anti-slip holds the workpiece secure. The manual doesn’t address how to pick the right screw length or how to set the drill-bit stop beyond a few preset settings. That isn’t good enough for me, so I need to do some measurements and geometry to figure out the exact lengths.
Basic Block Geometry
I like to be exact. The standard angle of Kreg pocket holes is 15 degrees, and the whole assembly approaches the workpiece at a fixed angle so that the hole starts at the right distance from the bottom of the board.
If I measure this in Photoshop (after skewing to correct for image distortion), I get an angle of 58.1 degrees.
Measure Tool in Photoshop
If I measure it directly (after accounting for the angle of the desktop by zeroing out the surface), then I get 57 degrees.
Direct Measurement
The most accurate method is to measure the rise and run directly.
Measuring \(\theta\) via x,y distances
So I took these measurements and came out with the following:
              s (mm)    d (mm)
high point     43.9      54
low point      10.1       0
accuracy (±)    0.1       0.1
Δ              33.8      54
θ (degrees)    57.96
So I now have three estimates: 58.1 degrees from Photoshop, 57 degrees from the direct measurement, and 57.96 degrees from the x,y distances. I’m going to go with the value computed from the x,y measurements and refine it as I measure real cuts.
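For the record, the last value is just the arctangent of the measured rise over run; a quick Python check using the values from the table above:

# Compute the jig angle from the measured x,y offsets (table above).
import math

delta_s = 43.9 - 10.1   # horizontal travel between high and low points, mm
delta_d = 54.0 - 0.0    # vertical travel between high and low points, mm
theta = math.degrees(math.atan2(delta_d, delta_s))
print(round(theta, 2))   # 57.96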
The geometry of this is cool: as the timber gets thicker, the drill bit translates up the ramp so that the tip of the drill still hits the center of the board. The angle of the jig’s wedge is twice the angle of the drill bit, since the bit angle has to be half the wedge angle if the bit is to intersect the base at half the thickness of the board. The jig is designed to bring the tip of the bit out in the center of the edge of the piece being drilled, automatically compensating for the timber thickness. Because of this, if the black slope is 58° from horizontal then it is 32° off vertical, and the actual angle of the pocket hole is 16°.
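In symbols, with \(\beta\) the ramp’s slope measured from horizontal, that last step is:

$$\alpha_{\text{pocket}} = \frac{90^\circ - \beta}{2} = \frac{90^\circ - 58^\circ}{2} = 16^\circ$$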
Armed with this angle, I have to do some geometry. Given \(t_{\text{min}}\), \(t_b\) (the thickness of the board receiving the pocket holes), \(t_T\) (the thickness of the board you are screwing into), \(\theta\) (the angle of the jig), and \(s\) (the length of the shaft of the screw), we want to find \(D\), the length of the pilot hole (i.e., where to set the drill-bit stop).
Geometry
From this I can calculate:
$$D = \frac{H}{\cos(15^\circ)} - \frac{s}{2} \quad\text{or}\quad D = \frac{H}{\cos(\theta/2)} - \frac{s}{2}$$
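As a sanity check, the formula is easy to evaluate in a few lines of Python (the H, θ, and s values below are placeholders, a 3/4″ board and a 1″ screw, not my actual settings):

# Evaluate D = H / cos(theta/2) - s/2 for the drill-bit stop setting.
import math

def pilot_length(H_mm, theta_deg, s_mm):
    return H_mm / math.cos(math.radians(theta_deg / 2.0)) - s_mm / 2.0

print(round(pilot_length(H_mm=19.0, theta_deg=30.0, s_mm=25.4), 1))   # ~7.0 mm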
I automate everything and PDFs are so easy to fill in automatically. It’s a digital world and we just live in it.
It’s so frustrating to get a “protected form” that doesn’t allow appending documents, automated form completion or even filling out fields. It’s silly because all of these forms are printable. Many years ago, I just created an official form from scratch myself because I couldn’t find it online. Without checking a signature, forms can’t be secure; for example, one could easily write a program that takes a picture of a form and recreates it. Remember, in the digital world something is either secured by math (cryptography) or it isn’t. The security in the transaction is provided by authenticating the user’s email. Even better, let’s move past forms and use that authentication in a website or mobile app.
Let’s do the right thing and move past forms caught in the middle ground of insecure and unusable. I think folks lock forms to prevent changes to the document; don’t do that.
Below is the code to fix this. You will need Linux or the Windows Subsystem for Linux. While I can’t think of an evil use case, use this responsibly. (My goal in doing this is always to fill data into a form.)
Use a little bash to print with Ghostscript. Pro tip: this isn’t “hacking”, this is just printing.
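The script itself isn’t shown here, but the core of the idea is to re-print the PDF through Ghostscript’s pdfwrite device, which writes a fresh copy that can be filled and annotated. Here is a rough equivalent wrapped in Python rather than bash (Ghostscript must be on the PATH; file names are placeholders):

# Re-print a PDF through Ghostscript's pdfwrite device to get a fresh,
# form-fillable copy. File names are placeholders.
import subprocess

def reprint_pdf(src, dst):
    subprocess.run(
        ["gs", "-q", "-dNOPAUSE", "-dBATCH",
         "-sDEVICE=pdfwrite",
         "-sOutputFile=" + dst,
         src],
        check=True,
    )

reprint_pdf("protected_form.pdf", "printable_form.pdf")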
While I qualified a long time ago on the M9, I never really learned to be a good shot. Now, I’m trying to learn to shoot well and wanted to automatically score my targets, keep the data, and get better.
There are apps to do this, but none of them did what I wanted. One app, by Thomas Gabrowski and Justa Mili, works off a photo of the target to automatically calculate the score. It can also analyze shooting groups for windage, elevation, mean radius and extreme spread, and it keeps track of previous shooting sessions to monitor progress. The app costs $17.
Developing my own costs my time, but it gives me the flexibility to work with my targets and my system. It’s also a thing I do: grab challenges that will teach me something. It has served me well, and Matlab makes this all accessible and fun. The other thing is that apps never work exactly right. What if I want the raw data so I can calculate whether I’m aiming high or low over time? All this code is on GitHub at https://github.com/tbbooher/automatic_target_scoring.
I told my children that two key skills every digital citizen needs are the ability to process text and images. By processing text, I’m able to tell any story from any reports in digital form; this is often bank statements, hotel stays, fitness stuff or Uber rides. By processing images, I’m able to understand and report on things that are happening around me.
In looking around, I found this thesis filled with some good ideas. I reached out to the author and discussed the merits of edge detection vs. template matching, but he didn’t have his code available. There were several papers, but none were really that helpful. It was easier to start building than to spend a lot of time reading others’ approaches.
I knew there would be three steps to this: (1) registering all images to a standard, fixed image for consistent distance, (2) finding the bullet holes and the center, and (3) measuring the distance from the center to each hole.
Image Registration
This was harder than I thought, since most registration techniques assume two similar images, and I was used to the ease of rapid registration in Photoshop. Most image-registration problems involve pictures of the same scene taken at different angles or distances; here I was registering pictures of what are really different scenes, even though the structure is common. The picture below makes this clear:
Reference and Real Image
I found two approaches that worked for image registration. The first was to extract the red circle and then make the circles match: calculate and align the centers, then rescale one image to the size of the other. Color thresholding and imfindcircles were quite useful.
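My code is in Matlab (linked above), but the same idea looks roughly like this in Python/OpenCV, a sketch with placeholder file names that swaps imfindcircles for a Hough-circle search:

# Circle-based alignment sketched in Python/OpenCV (the original is Matlab).
import cv2
import numpy as np

def find_target_circle(path):
    # Grayscale Hough search; the Matlab version used color thresholding first.
    img = cv2.imread(path)
    gray = cv2.medianBlur(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
                               param1=100, param2=50, minRadius=50, maxRadius=0)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return x, y, r

ref = find_target_circle("reference_target.jpg")
new = find_target_circle("shot_target.jpg")
if ref and new:
    dx, dy = ref[0] - new[0], ref[1] - new[1]   # shift to align centers
    scale = ref[2] / new[2]                     # rescale one radius to the other
    print(dx, dy, round(scale, 3))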
For the more general case, I had to use fitgeotrans, which takes the pairs of control points, movingPoints and fixedPoints, and uses them to infer the geometric transformation. After doing this I had a set of images that were all the same size and all in the same orientation, each with its bullet holes.
Registered Images
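The control-point approach has a direct analogue outside Matlab as well; roughly, in Python/OpenCV (a sketch with made-up point pairs and placeholder file names):

# Control-point registration, roughly analogous to Matlab's fitgeotrans:
# fit a transform from matched point pairs, then warp the moving image
# onto the fixed one.
import cv2
import numpy as np

moving_points = np.float32([[102, 95], [410, 92], [405, 408], [98, 400]])
fixed_points  = np.float32([[100, 100], [400, 100], [400, 400], [100, 400]])

H, _ = cv2.findHomography(moving_points, fixed_points)
moving = cv2.imread("shot_target.jpg")
registered = cv2.warpPerspective(moving, H, (500, 500))   # size of the reference image
cv2.imwrite("registered_target.jpg", registered)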
Finding the bullet holes
I was able to use this Matlab post to learn that I could sample some colors in Photoshop, convert the image to HSV, and find shades of gray using some code from Theodoros Giannakopoulos.
The next thing I had to do was find the center. I did this by recognizing that the center X is red and pretty distinctive, which makes it ideal for template matching using normalized cross-correlation (Matlab has a great description of how this works here). With this accomplished, I can find the center in a few lines, working from this template:
Template
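The same normalized cross-correlation idea, sketched in Python/OpenCV rather than Matlab (file names are placeholders):

# Find the center "X" by template matching with a normalized correlation score
# (OpenCV's TM_CCOEFF_NORMED), the same idea as Matlab's normxcorr2.
import cv2

target = cv2.imread("registered_target.jpg", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("center_x_template.jpg", cv2.IMREAD_GRAYSCALE)

scores = cv2.matchTemplate(target, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

h, w = template.shape
center = (best_loc[0] + w // 2, best_loc[1] + h // 2)
print(center, round(best_score, 2))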
All together, I’m able to compute the measurements to make a picture like this (note the green circle in the middle on the X):
Result
With the image registered, the center defined, and all holes discovered, I could easily calculate a score as the mean distance to the bullseye.
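The scoring step is then just distance arithmetic over the detected hole coordinates, something like this sketch (the coordinates shown are examples only):

# Score = mean distance (here in pixels) from each detected hole to the center.
import math

def mean_radius(center, holes):
    cx, cy = center
    return sum(math.hypot(x - cx, y - cy) for x, y in holes) / len(holes)

print(mean_radius((500, 500), [(480, 510), (530, 495), (505, 540)]))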
Problems
The problem was that I couldn’t get good consistency. Shadows were a problem on some images; on others, shots very close to one another caused confusion. It turned out that I was really good at quickly spotting the holes myself, better than the template-matching approach. Note that when I saved each image, I updated an xls file and saved the scores as EXIF data, so the image carries the exact locations of the holes and I can pull them out later if needed. The code works well for my purposes and, best of all, I learned a lot about how to manipulate and extract data from images.
Results
So, is my shooting getting better? Not yet. In the plot below you can see my score (mean distance) is increasing, and the standard deviation of my shots is increasing as well. The data aren’t uniform, though: I had the target at 5 m and moved it to 6.5 m on Oct 8, and Sept 12 was with a suppressed .22 at 5 m while Oct 8 was 9 mm. Anyway, it’s better to know from data than to be guessing. I’m chalking this up to an improved technique that is taking some time to adjust to.
I needed a torsion table to make sure my Shapeoko XXL had a solid foundation. Dimensions for the Shapeoko are available here. The basic idea of a torsion box is two thin skins on either side of a lightweight core, usually a grid of beams. Torsion boxes are used in aircraft wings and vertical stabilizers. The final product resists torsion under an applied load: the thin surfaces carry the imposed loads primarily in tension, while the closely spaced core keeps the opposite skin from buckling under compression.
Marc Spagnuolo, a.k.a. “The Wood Whisperer,” put together a pretty comprehensive 20-minute-plus video on how he built his. Spagnuolo shows how to get past the dilemma of building your first torsion box: how do you construct a perfectly flat surface before you’ve got a perfectly flat surface to assemble it on?
My design was intended to look similar to his, but I didn’t like the idea of having to keep all the individual pieces straight, so I built a design based on half-lap joints.
Example Build
Lining everything up perfectly was super easy after cutting the slots. MDF is an amazing material.
My end design looked like this (I always use mm for dimensions).
I used my table saw to cut the core pieces all to the same height. A torsion box is a completely flat, very sturdy and relatively lightweight surface, and anyone designing anything structural and rectilinear should understand its principles. The concept is simple, even if construction can be tedious: two flat, horizontal skins are sandwiched over a grid of crossmembers, and once the sandwich is glued shut, it achieves a rigidity much greater than that of the individual parts.
Pieces Cut
I added shims to the bottom of the saw horses to make sure the base was level. I used lots of glue.
Assembly drying
By using half-lap cuts I was able to get all the spacing right. It was critical to square the boards.
Half-laps cut
Here you can see my use of pocket screws and my testing of the table to ensure it is level.
Initial design
I love using cutlist optimizer to speed up my cuts and optimize the use of the wood. I’ve designed cutting algorithms myself in the past, and this online tool is fast, accurate and excellent (https://www.cutlistoptimizer.com/). I did make a design change, ripping long strips so I could avoid the inaccuracy of all the small cuts.
Cutlist
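For flavor, the heart of a one-dimensional cut optimizer is bin packing. Here is a first-fit-decreasing sketch in Python (illustrative only, with made-up part lengths; this is not what cutlistoptimizer.com actually runs):

# First-fit-decreasing packing of part lengths onto stock boards, with a
# saw-kerf allowance per cut. Lengths in mm are examples only.
def pack_cuts(parts_mm, stock_mm, kerf_mm=3.2):
    boards, remaining = [], []
    for part in sorted(parts_mm, reverse=True):
        for i, room in enumerate(remaining):
            if part + kerf_mm <= room:
                boards[i].append(part)
                remaining[i] -= part + kerf_mm
                break
        else:
            boards.append([part])
            remaining.append(stock_mm - part - kerf_mm)
    return boards

print(pack_cuts([800, 800, 610, 610, 450, 450, 450], stock_mm=2440))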
I made a video of my design process: I made a component with the half-laps cut out, replicated it with the design tools, and flipped a copy that I rotated 90 degrees.
Kerf-mount corner brackets are great, but they take some thinking if you are working with larger lumber. I recently purchased these and these from Amazon.
The corner bracket looks like this:
The trick is figuring out where to cut the leg at 45 degrees and where to cut the kerfs, especially if the leg isn’t square. I was going to work out the geometry, but instead I measured the bracket, drew a horizontal profile in Visio, and then measured the geometry from the drawing. Since the leg isn’t square, I had to decide where the bracket mounts flush. I’ve included my drawing here in the hope that it may be helpful to you.
“If love is the essence and totality of the good demanded of us, how can it be known that we love?”
Karl Barth
We think in groups and live in tribes. It’s hard to believe anything that doesn’t align with a big group of folks. The historical struggle between economic classes is shifting to a conflict between specific identity groups. This is a consequence of the failure of Marxism in practice. I’ve been given a front-row seat to observe that the power in our culture is increasingly concentrated into a few geographic regions that control business, marketing and media. Old ideas are recycled into weapons to gain political power as new groups align to seek their own self-interest. This leaves a lot of us confused as we try to live authentic and peaceful lives in light of constantly changing goalposts.
One way to view history is by teasing out the changes in hopes and fears. All people are constantly trying to be safe and in control of their lives, and some people (generally the *elite* which has been everything from the church to the secular left) are always trying to control others. It is a modern activity to leverage technology and the marketplace of ideas as a means to power. Since the 15th century, Europe has been the source of radical transformation. The shift from pre-Modernity to Modernity ushered in an era of constant change starting with the Italian Renaissance, followed by the growth of Humanism and the Reformation movement. The colonization of the East and the Americas, the Enlightenment, the French Revolution, new nationalistic states and the Industrial Revolution made all this spin faster. However, nothing accelerated things more than technology and the ability to record and share scientific knowledge. (cf Karl Barth, Die Protestantische Theologie im 19. Jahrhundert)
Things looked rosy for America and the West at the start of the 20th century. Scientists performed miracles. Automobiles, modern factories, new medicines and aircraft gave the news a constant stream of novel wonders to share. Western countries were confident of their superiority as they reached the zenith of their political and economic power. This was coincident with an age where many theologians were optimistically convinced of man’s natural ability to know God and speak about God. They believed theology needed to be as “scientific” as all the other sciences. They were convinced that it would be possible to speak about God in scientific terms, based on the innate qualities of humanity. Human reason, experience, morality and history became the foundation of religious discourse. There were no doubts about our ability to improve and reshape society with the aid of scientific knowledge. Scientists were convinced that unlimited progress would create a better and brighter future for all people. Dreamers were in vogue, reading novels such as Jules Verne’s From the Earth to the Moon (De la terre à la lune), the story of the Baltimore Gun Club and its attempt to build an enormous space gun that could launch the club’s president and a French poet to the moon.
Onward!
World War I changed everything. Optimism was replaced by fear, and by the knowledge that science and technology not only facilitated the progress and well-being of humanity, but also the devastation of society and the destruction of humanity. This realization caused a major crisis in European society.
It was this crisis that led to our current discussion of critical race theory, which is an offshoot of critical theories that trace back to intellectuals, academics, and political dissidents dissatisfied with the contemporary socio-economic systems (capitalist, fascist, communist) of the 1930s. The Frankfurt School was an ideological consolation prize for the Marxists of the failed German Revolution of 1918-19, in the same way that Woke Progressivism was a consolation prize for those of the failed Revolution of ‘68. It was originally located at the Institute for Social Research (Institut für Sozialforschung), an attached institute at the Goethe University in Frankfurt, Germany. The Institute was founded in 1923 thanks to a donation by Felix Weil with the aim of developing Marxist studies in Germany. After 1933, the Nazis forced its closure, and the Institute was moved to the United States where it found hospitality at Columbia University in New York City. The Frankfurt theorists proposed that existing social theory was inadequate for explaining the turbulent political factionalism and reactionary politics that arose from 20th-century liberal capitalist societies. Critical of both capitalism and Marxism–Leninism as philosophically inflexible systems of social organization, the School’s critical-theory research pointed toward alternative paths for the social development of a society and a nation.
The academic influence of the critical method is far reaching. Some of the key issues and philosophical preoccupations of the School involve the critique of modernity and capitalist society, the definition of social emancipation, as well as the detection of the pathologies of society.
The legacy of the Frankfurt School is Critical Theory, a full-fledged philosophical and sociological movement spread across many universities around the world. Critical Theory provides a specific interpretation of Marxist philosophy with regard to some of its central economic and political notions like commodification, reification, fetishization and the critique of mass culture. Marxism led to the Frankfurt School, which led to Critical Theory, followed by Critical Legal Studies, and finally Critical Race Theory. The end result of all this in the public square today is a post-modern struggle between cultures and races that emphasizes lived experience over liberal argumentation and truth discovery. When people talk past each other, they often fail to realize that they are operating in wholly different truth systems.
Dudes with Ideas
When lived experience is emphasized over other sources of truth such as science and reason, everything becomes a racial power struggle. Philosophically, we trade Kant’s logical system for Foucault’s rejection of the knowability of anything. Marx’s fervent calls for bloody class warfare are replaced with an equally fervent focus on inter-racial dynamics, as CRT assumes a priori that racism is present in everything under a doctrine known as “systemic racism.”
Karl Barth thinking and writing
Enter Karl Barth (1886-1968), the local pastor of the small industrial town of Safenwil in the Swiss canton of Aargau. A fascinating fellow, he is no evangelical, but is the father of neo-orthodoxy and crisis theology. He addressed critical theory with a focus on the sinfulness of humanity, God’s absolute transcendence, and the human inability to know God except through revelation. The critical nature of his theology came to be known as “dialectical theology,” or “the theology of crisis.” This initiated a trend toward neo-orthodoxy in Protestant theology. The neo-orthodoxy of Karl Barth reacted strongly against liberal Protestant neglect of historical revelation. He wanted to lead theology away from the influence of modern religious philosophy, with its emphasis on feeling and humanism, and back to the principles of the Reformation and the teachings of the Bible.
Karl Barth presciently used the modern language of Wokeness in his defense of orthodoxy. He defined the entire life of Christian discipleship as people who are continually reawakened – continuous repentance, continuous transformation, continuous renewal. Barth was careful to say that Christians aren’t the people who are awake vs. everybody else who’s asleep. Christians are those who constantly stand in need of reawakening from the sleep of all kinds of errors and “fantasies and falsehoods.” To Barth, we have to be on guard so we don’t fall asleep to what’s true, and what’s coming to us in Jesus’ way of love and peace.
Barth departed from evangelicals in viewing the Bible not as the actual revelation of God but only as the record of that revelation. To Barth, God’s single revelation occurred in Jesus Christ. In short, Barth rejected two main lines of interest in the Protestant theology of that time: historical criticism of the Bible and the attempt to find justification for religious experience in philosophy and other sources. Barth saw great value in historical criticism on its own level, but it often led Christians to lessen the significance of the apostolic community’s testimony to Jesus, which rests on faith and not on history. Theology that leans on philosophy is always on the defensive, more anxious to accommodate the Christian faith to others than to pay attention to what the Bible really says.
“The person who knows only his side of the argument knows little of that.” — Karl Barth
Barth stays out of the evangelical camp due to his view of the individual’s role in scriptural interpretation. John Calvin, by contrast, emphasizes the inspiration of Scripture, the text itself being God-breathed, regardless of whether or how believers receive it. Barth prefers to speak of the out-breathing of the Spirit of God in both the text and the believer, thus distancing himself both from the exegesis of Scripture and from the Reformed tradition.
However, Barth is a bold defender of the rights of the individual and of the goodness of self-criticism. One of my favorite Barth stories tells of a letter he received that said, “Professor Barth, I have discovered the following contradictions in your writings. What do you say about these contradictions?” Barth ostensibly wrote back: “Well, here are some others,” listed a few more contradictions, and signed off, “Yours faithfully . . .” This is a powerful statement of the liberal idea of welcoming self-criticism.
This is why I find such joy in revisiting Karl Barth. He passes my “coffee test”: I know I would enjoy a sit-down with him. He combines love and grace with an intense pursuit of the truth and then dares to think original thoughts. The fact that he doesn’t fit in my American Evangelical tribe is a welcome bonus. I’m pretty sure everything I believe is wrong in some way; my orthodox theology, my teleology and my scientific worldview all compel me to admit that every tenet I hold should be tested and improved. This is why I love voices that start with grace and end with brilliance. I’m open to change and hungry to learn, but skeptical of political agendas. I’m aware that history is the story of power politics. Oppression is real, but it doesn’t belong to one identity. Insight and wisdom are real, but they don’t belong to one group. He reminds us that we are all equally guilty, and equally deserving of grace. Karl Barth preached, wrote and shared his wisdom by inviting others to learn. He and I share the same loves (wisdom, Jesus, learning and talking) and many of the same convictions (that grace and redemption are real, possible and freely available). I’m glad he took the time to share his thoughts, as they are a great comfort in times such as these.
Testing HTTPS locally is always hard, but I’m against testing on production or even on a remote server.
Things are also complicated by developing on Linux as a subsystem on Windows via WSL2. I was able to use mkcert to get SSL to work locally.
While I would love to use Let’s Encrypt locally, Let’s Encrypt can’t provide certificates for “localhost” because nobody uniquely owns it. Because of this, they recommend you generate your own certificate, either self-signed or signed by a local root, trust it in your operating system’s trust store, and then use that certificate in your local web server. They describe this well on their website.
Using certificates from real certificate authorities (CAs) for development can be dangerous or impossible (for hosts like example.test, localhost or 127.0.0.1), but self-signed certificates cause trust errors. Managing my own CA may be the best solution; mkcert automatically creates and installs a local CA in the system root store and generates locally-trusted certificates. I was able to modify my nginx.conf for my container test environment and open the necessary ports in docker-compose (- 443:443) to get this working just fine.
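As a quick sanity check that the mkcert-generated certificate is actually trusted, you can serve anything over it before wiring up nginx; for example, with Python’s built-in server (the certificate file names are placeholders for whatever mkcert produced):

# Serve the current directory over HTTPS using the mkcert-generated certificate,
# just to confirm the browser trusts it. File names are placeholders.
import http.server
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("localhost.pem", "localhost-key.pem")

httpd = http.server.HTTPServer(("127.0.0.1", 4443), http.server.SimpleHTTPRequestHandler)
httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
print("serving on https://localhost:4443")
httpd.serve_forever()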
You can see my working code here on a new git branch.
I have to build and interact with things to understand them. At MIT, we focused on the combination of thought and practical action to solve current and future problems; our motto (mens et manus) combines application (hand) and theory (mind).
By day, I’m leading the adoption of containers and automation technologies to drive big changes in software reusability. By night, I’m using containers to teach myself new programming languages, interfaces and networking concepts. Last week, I wanted to learn how reverse proxies work and used containers and some familiar technology (Express, Flask, nginx and Docker) to help me.
First, I’m sharing this because I wish something like it had existed for me to learn from, so please head to GitLab at https://gitlab.com/tim284/nginx_proxy_test, clone it, and let me know if you do anything awesome with it.
Because I like Plutarch in general and studying the Battle of Thermopylae in particular, you will notice a theme. (Please, the movie is all cool, but it is nothing close to reading Gates of Fire.)
Containers
Containers are a solution to the problem of how to get software to run reliably when moved from one computing environment to another. The basic idea is to take on only the complexity and overhead that you want: instead of running a full operating system in a virtual machine, you use just the bits you need. A container consists of an entire runtime environment: an application, plus all its dependencies, libraries and other binaries, and the configuration files needed to run it, bundled into one package. By containerizing the application platform and its dependencies, differences in OS distributions and underlying infrastructure are abstracted away.
Architecture
I created two Express applications, one Flask app and one static site. I run the active apps in containers and use an nginx reverse proxy to present them; the static HTML page is mounted through a Docker volume. Free Code Camp wrote a nice tutorial that explains this type of setup.
One of the key concepts I had to learn was how networking works in Docker. This seemed like a rite of passage for working with containers in general. This article helped me a lot.
Docker
Complexity continues to increase with the continuous introduction of new programming languages, hardware, architectures and frameworks, and with discontinuous interfaces between tools for each lifecycle stage. Containers allow you to focus on what you are building and quickly adopt new technology. Most important, I can quickly change and reuse things, which lets me learn a lot quickly. It can be hard to remain a full-stack developer, a dad and a business leader. Docker simplifies a lot of things I don’t have time to learn and accelerates my workflow, allowing me to experiment and innovate with different tools, application stacks, and deployment environments.
For this experiment, I use docker-compose to pull in images for nginx, Flask, Express and the Redis database.
NGINX
Nginx is a popular web server (serving 23.21% of sites) that can also be used as a reverse proxy, load balancer, mail proxy and HTTP cache. The software was created by Igor Sysoev and publicly released in 2004. A company of the same name was founded in 2011 to provide support and the paid Nginx Plus software. In March 2019, the company was acquired by F5 Networks for $670 million. What a crazy startup idea: take an open-source project, improve and support it, and start a company (GitHub, Nginx, etc.).
Nginx is built to handle many concurrent connections at the same time. It can handle more than 10,000 simultaneous connections with a low memory footprint (~2.5 MB per 10k inactive HTTP keep-alive connections). This makes it ideal for being the point-of-contact for clients. The server can pass requests to any number of backend servers to handle the bulk of the work, which spreads the load across your infrastructure. This design also provides you with flexibility in easily adding backend servers or taking them down as needed for maintenance.
Another instance where an HTTP proxy might be useful is when using application servers that might not be built to handle requests directly from clients in production environments. Many frameworks include web servers, but most of them are not as robust as servers designed for high performance like Nginx. Putting Nginx in front of these servers can lead to a better experience for users and increased security. This post from Digital Ocean is awesome at explaining all of this.
Reverse Proxy
A proxy means that information goes through a third party before getting to its destination. Why use one? For example, if you don’t want a service to know your IP, you can use a proxy: a server that has been set up specifically for this purpose. If the proxy server you are using is located in, for example, Amsterdam, the IP that will be shown to the outside world is the IP of the server in Amsterdam. The only ones who will know your IP are the ones in control of the proxy server.
Proxying in Nginx is accomplished by manipulating a request aimed at the Nginx server and passing it to other servers for the actual processing. The result of the request is passed back to Nginx, which then relays the information to the client. The other servers in this instance can be remote machines, local servers, or even other virtual servers defined within Nginx. The servers that Nginx proxies requests to are known as upstream servers.
A reverse proxy, by contrast, masks not outgoing connections (you accessing a web server) but incoming connections (people accessing your web server). You simply provide a URL like example.com, and whenever people access that URL, your reverse proxy takes care of where that request goes.
Here I’m using a reverse proxy so I can have services running on several ports while only exposing ports 80 and 443 (HTTP and HTTPS, respectively). All requests come into my network on those two ports, and the reverse proxy takes care of the rest.
Nginx can proxy requests to servers that communicate using the http(s), FastCGI, SCGI, uwsgi, or memcached protocols through separate sets of directives for each type of proxy. The Nginx instance is responsible for passing on the request and massaging any message components into a format that the upstream server can understand.
My Nginx config below allowed me to proxy_pass to the upstream servers that Docker created.
First, I need some apps to serve content, and to make sure I understand how to proxy to different services, I use both JavaScript (Express) and Python (Flask).
Express
Express.js, or simply Express, is a back-end web application framework for Node.js, released as free and open-source software under the MIT License. It is designed for building web applications and APIs and has been called the de facto standard server framework for Node.js. I like it because I can create a web application in just a few lines.
Flask App
Flask is about the most minimal Python web application framework there is. My app is bare-bones simple and just returns some basic text.
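The app isn’t reproduced inline (it’s in the repo), but it is essentially the standard Compose/Flask pattern: a route that bumps a counter in the Redis container and reports it. A sketch of that pattern, which matches the “viewed N time(s)” output shown in the results below and may differ from my actual code:

# Minimal Flask app with a Redis hit counter. "redis" is the docker-compose
# service name; this sketch may differ from the code in the repo.
from flask import Flask
import redis

app = Flask(__name__)
cache = redis.Redis(host="redis", port=6379)

@app.route("/")
def hello():
    count = cache.incr("hits")
    return f"This Compose/Flask demo has been viewed {count} time(s).\n"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)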
Static Page
In order to test the most basic feature of nginx, I built a static page.
Results
First I want to see all of my containers running:
➜ proxy_test git:(master) ✗ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ac83546944da nginx "/docker-entrypoint.…" 5 minutes ago Up 5 minutes 0.0.0.0:80->80/tcp, :::80->80/tcp proxy_test_webserver_1
36d93cfe4a41 proxy_test_flask "/bin/sh -c 'python …" 7 hours ago Up 7 hours 0.0.0.0:5000->5000/tcp, :::5000->5000/tcp proxy_test_flask_1
30f29133b42a proxy_test_lochagus "docker-entrypoint.s…" 7 hours ago Up 7 hours 0.0.0.0:49160->8080/tcp, :::49160->8080/tcp proxy_test_lochagus_1
0d1a98a27730 proxy_test_leonidas "docker-entrypoint.s…" 7 hours ago Up 7 hours 0.0.0.0:49161->8080/tcp, :::49161->8080/tcp proxy_test_leonidas_1
4770b2726dfd redis "docker-entrypoint.s…" 7 hours ago Up 7 hours 6379/tcp proxy_test_redis_1
And does it work? curl -i localhost produces my static page. But most important, curl -i localhost/flask produces dynamic content:
HTTP/1.1 200 OK
Server: nginx/1.21.0
Date: Sun, 06 Jun 2021 10:09:03 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 51
Connection: keep-alive
This Compose/Flask demo has been viewed 23 time(s).%
Next Steps
This is just the beginning. Next, I’m going to use GitLab to deploy all of this, Let’s Encrypt to secure all of the traffic, and more.