The diameter of the M87 black hole's emission ring, 54.8 million light-years away, is about 43 millionths of an arcsecond (43 microarcseconds). The resolving power of a telescope is the smallest angular separation at which two objects can still be seen as distinct. It is limited by the telescope's baseline: for a single dish, the diameter of its aperture; for an interferometer, the distance between its elements.

The 2.4-meter Hubble Space Telescope has the resolving power to see down to roughly 50 milliarcseconds; the M87 black hole is over a thousand times too small for Hubble to see. The ESO Very Large Telescope Interferometer (VLTI), with baselines up to about 120 meters and used to study Sagittarius A* (Sgr A*), can resolve down to about 2 milliarcseconds; the M87 black hole is still almost 50 times too small for the VLTI to see. To get down to the microarcsecond level, a much bigger telescope is needed.

To get a baseline that large, the EHT team upgraded and connected a worldwide network of eight pre-existing radio telescopes at a variety of high-altitude sites. These locations included volcanoes in Hawaii and Mexico, mountains in Arizona, Spain, and Chile, and one station in Antarctica. Baseline lengths between telescopes vary, but the longest approach the diameter of the Earth. This results in an array with a resolution limit of around 21 microarcseconds, enough to see the M87 black hole's emission ring with its shadow.

Although the telescopes are not physically connected, their recordings are synchronized by atomic clocks that precisely timestamp each observation. Each telescope produced roughly 350 terabytes of data per day. This data was stored on high-performance hard drives and flown to highly specialized supercomputers at the MIT Haystack Observatory and the Max Planck Institute for Radio Astronomy, where the data streams were combined.

In science, an inverse problem is the process of calculating from a set of observables back to the causal factors that produced them.
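The resolution figures above follow from the diffraction limit, roughly the observing wavelength divided by the baseline. A minimal sketch (the wavelengths here are illustrative assumptions: visible light for Hubble, the EHT's 1.3 mm radio band for the Earth-sized array):

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600  # ~206,265 arcseconds per radian

def resolution_uas(wavelength_m, baseline_m):
    """Diffraction-limited angular resolution, theta ~ lambda / D,
    returned in microarcseconds."""
    return wavelength_m / baseline_m * ARCSEC_PER_RAD * 1e6

# Hubble: 2.4 m mirror, visible light (~550 nm assumed here)
hubble = resolution_uas(550e-9, 2.4)      # tens of thousands of uas, i.e. ~50 mas

# EHT: Earth-diameter baseline (~12,742 km) at 1.3 mm radio wavelength
eht = resolution_uas(1.3e-3, 12_742_000)  # ~21 uas
```

This shows why simply building a bigger single dish is hopeless at optical wavelengths, and why an Earth-sized radio interferometer reaches the ~21 microarcsecond regime.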
It is always a challenge to figure out what was actually happening at the source of the collected data. For example, the ESO VLTI team reconstructed the motion of the star S2 around Sgr A*, 26,000 light-years away, from interferometric data sampled in the uv plane by telescopes here on Earth. The EHT directly measured samples of the radio brightness distribution on the sky, called visibilities. The inverse problem is to calculate what sky image caused this distribution.

Radio data is normally quite dirty. One of the primary radio astronomy interferometry techniques is a deconvolution algorithm called CLEAN, which iteratively removes the instrument's response from a dirty image to recover the underlying sources. However, there were two challenging issues with the EHT data. First, samples were limited to only a few hours a day for four days. Because the visibility plane is only sparsely sampled, the inverse problem is underconstrained: the number of potential images that could lead to these visibilities expands dramatically. And second, the measured visibilities had large amplitude-calibration uncertainties.

To address these challenges, a new imaging algorithm was developed that incorporated additional assumptions and constraints designed to produce images that are physically plausible while remaining consistent with the data. The algorithm is called regularized maximum likelihood, RML for short. It searches for an image that is not only consistent with the observed data but also favors specific image properties like smoothness and compactness. A library of tens of thousands of simulated images was created from general relativistic magnetohydrodynamics (GRMHD) models with different parameter combinations. Using both CLEAN and RML on the visibilities, the final image was produced. It supports Einstein's general theory of relativity only in part, because general relativity was assumed in order to get the image.
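The deconvolution idea behind CLEAN can be illustrated with a toy one-dimensional Högbom-style sketch. This is not the EHT pipeline, which works on 2-D images with real uv coverage; the Gaussian beam, source positions, and parameters below are all hypothetical:

```python
import numpy as np

def hogbom_clean(dirty, beam, gain=0.2, niter=500, threshold=1e-3):
    """Minimal 1-D Hogbom-style CLEAN: repeatedly locate the brightest
    residual pixel, subtract a scaled copy of the dirty beam centered
    there, and record the subtracted flux as a point-source component."""
    residual = dirty.astype(float).copy()
    model = np.zeros_like(residual)
    center = len(beam) // 2
    for _ in range(niter):
        peak = int(np.argmax(np.abs(residual)))
        if abs(residual[peak]) < threshold:
            break  # residuals are down at the noise floor; stop cleaning
        flux = gain * residual[peak]
        lo = max(0, peak - center)
        hi = min(len(residual), peak - center + len(beam))
        # subtract the shifted, scaled beam from the residual
        residual[lo:hi] -= flux * beam[lo - peak + center:hi - peak + center]
        model[peak] += flux
    return model, residual

# Hypothetical demo: two point sources blurred by a Gaussian dirty beam
beam = np.exp(-0.5 * (np.arange(-10, 11) / 2.0) ** 2)
sky = np.zeros(64)
sky[20], sky[40] = 1.0, 0.5
dirty = np.convolve(sky, beam, mode="same")
model, residual = hogbom_clean(dirty, beam)
```

After cleaning, `model` concentrates the flux back into the two point-source positions and `residual` is nearly flat, which is the sense in which CLEAN turns a dirty image into a clean one.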
The visibilities could have been explained by some other theory, but without general relativity, they just produced static.
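The regularized maximum likelihood idea described above can also be sketched in one dimension: find an image that fits sparse Fourier samples (visibilities) while a regularizer penalizes implausible structure. This is a toy under heavy simplifying assumptions (perfect calibration, a smoothness-only regularizer, plain projected gradient descent), not the actual EHT code; the demo "sky" and sampled frequencies are hypothetical:

```python
import numpy as np

def rml_fit(vis, freqs, n=64, lam=0.05, lr=0.1, steps=2000):
    """Toy regularized maximum likelihood fit: gradient descent on a
    data-misfit term plus a smoothness penalty, with a positivity
    constraint on the image pixels."""
    k = np.arange(n)
    A = np.exp(-2j * np.pi * np.outer(freqs, k) / n)  # DFT forward model
    x = np.full(n, np.abs(vis).mean() / n)            # flat initial image
    for _ in range(steps):
        resid = A @ x - vis                                   # data misfit
        grad_data = (A.conj().T @ resid).real / len(vis)
        grad_smooth = 2 * x - np.roll(x, 1) - np.roll(x, -1)  # smoothness gradient
        # gradient step, then project onto non-negative images
        x = np.clip(x - lr * (grad_data + lam * grad_smooth), 0.0, None)
    return x

# Hypothetical demo: a smooth 1-D "sky" observed at only a few spatial frequencies
n = 64
k = np.arange(n)
true_sky = np.exp(-0.5 * ((k - 30) / 4.0) ** 2)
freqs = np.arange(7)                                  # sparse uv coverage
vis = np.exp(-2j * np.pi * np.outer(freqs, k) / n) @ true_sky
image = rml_fit(vis, freqs, n=n)
```

The regularizer and positivity constraint are what pick one physically plausible image out of the many that fit the sparse visibilities equally well, which is exactly the underconstrained-inverse-problem issue the EHT team faced.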