(Don't ask me how I was able to produce it with their complex GUI, because I couldn't do it twice.)

Blender is a Linuxoid product. Linuxoids have no GUI design skills or tradition, and GUIs of Blender's level of complexity are certainly beyond their capabilities. And it is a real pity because, functionally, Blender is a 100% professional software suite and a very good one at that.
Someone should do humanity a great service and go down in history by reworking Blender's crooked GUI to follow the principles of ergonomics and user-friendliness implemented in other similar editing environments like, say, Autodesk 3ds Max or Adobe Photoshop. The Blender source code is open to everyone, after all.

... we are concerned only by the red border ...
No, not in the least, my friend.
On the contrary, we are concerned with every line intersection point you're seeing in your upper UV map. These points (called UVs, or UV pairs) are the projections of the model's 3D vertices onto the texture map plane, and each flat triangle you're seeing is consequently the projection of a corresponding 3D mesh polygon (triangular rather than quadrangular in our case) onto the same plane. U stands for the plane X coordinate in the more customary high-school notation, while V stands for the Y coordinate. The seldom-used 3D textures also distinguish a third W coordinate to denote the texture "depth" level, similar to the Z coordinate in our customary 3D space. 3D textures are sometimes used to efficiently colorize virtual volumes like sky clouds, clouds of smoke, or street light volumes ("shafts").
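In code, the vertex/UV relationship described above can be sketched roughly like this (a minimal illustration in plain C; the struct and field names are mine, not from any particular engine):

```c
#include <assert.h>

/* A normalized texture coordinate pair: the projection of a 3D
 * vertex onto the texture map plane. U maps to X, V maps to Y. */
typedef struct { float u, v; } UV;

/* Seldom-used 3D textures add a third W ("depth") coordinate,
 * analogous to Z in ordinary 3D space. */
typedef struct { float u, v, w; } UVW;

/* A mesh vertex carries both its 3D position and its UV projection. */
typedef struct {
    float x, y, z;  /* position in 3D space            */
    UV    uv;       /* projection onto the texture map */
} Vertex;

/* Each triangular mesh polygon therefore projects to a flat
 * triangle on the texture plane: three vertices, three UVs. */
typedef struct { Vertex a, b, c; } Triangle;
```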
In the most general case, a jaggie (or any other unwanted artifact) may occur at any point within the texture map borders. So we should be able to drag an arbitrary triangular wireframe intersection point (i.e. some polygon corner or apex, a.k.a. "vertex" in traditional 3D lingo), or several of them, manually across the nearby portions of the texture map to try and hide the artifact from direct view, if at all possible.
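A minimal sketch of such a manual drag in plain C (the clamping keeps the UV inside the texture map borders; the function names are mine):

```c
#include <assert.h>

/* Clamp a value to the normalized [0, 1] texture range. */
static float clamp01(float v)
{
    return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
}

/* Nudge one UV pair by (du, dv) without letting it
 * escape the texture map borders. */
static void drag_uv(float *u, float *v, float du, float dv)
{
    *u = clamp01(*u + du);
    *v = clamp01(*v + dv);
}
```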
You've evidently been misled by the visible similarity of the wireframe "islands" in the lower texture map to the USA outlines in your Tor demo video.

There's absolutely no problem "projecting" the UVs written in the OBJ file and stored in their corresponding 3D vertex structures onto the texture plane. The UV (or XY, if you prefer) coordinates have already been pre-calculated and normalized to the [0.0f, 1.0f] range for you at model design time in the model designer's UV unwrapper software. All you have to do is multiply the floating-point U value by the texture pixel width and the V value by the texture pixel height, and plot the resulting pixel point (a.k.a. apex, corner, vertex, UV, UV pair) in the texture map, or against it if it is just a background "layer". In practice, however, it appears easier to denote the UVs as 3x3 px boxes rather than single pixels (texels), in both 2D and 3D, to be able to easily aim at them and select them for manual dragging as shown below.
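That mapping, plus the 3x3 px selection box, boils down to something like this in plain C (one caveat: depending on the image origin convention — GDI's top-left vs. OBJ's bottom-left — you may need to flip V as 1.0f - v first; the function names are mine):

```c
#include <assert.h>

/* Convert a normalized UV pair into texture pixel coordinates. */
static void uv_to_pixel(float u, float v, int texW, int texH,
                        int *px, int *py)
{
    *px = (int)(u * (float)texW);
    *py = (int)(v * (float)texH);  /* or (1.0f - v) * texH if flipped */
}

/* Expand a plotted UV pixel into a 3x3 px box so it is
 * easy to aim at and select for manual dragging. */
static void uv_handle_box(int px, int py,
                          int *left, int *top, int *right, int *bottom)
{
    *left  = px - 1;  *top    = py - 1;
    *right = px + 1;  *bottom = py + 1;
}
```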
(In Xandreale, I can select the UVs of interest (shown in red) in both the 3D and 2D viewports, whichever I find easier. What cannot be done in the existing 3D editing software but can be done in Xandreale is that you can walk up to any object of interest in the 3D viewport using the FPS (first-person-shooter) camera and manually mark any single vertex in the scene that you would like to work with in your current scene editor.)

OTOH, drawing the associated wireframe map is more difficult in 2D GDI, where I need to prepare a special UV (vertex, apex, etc.) array in triples, following the vertex indices that OpenGL uses to draw its GL_TRIANGLES (or GL_QUADS) based on the VBA or VBO data, and then use a series of Polygon() API calls (or a single PolyPolygon() call) to draw the GDI equivalent of the OpenGL wireframe projection on the texture map plane. Practice shows that drawing 30K of UVs using Polygon() is too slow due to the function call overhead, while one single call to PolyPolygon() is almost as fast as its OpenGL glDrawElements(GL_TRIANGLES, ...) equivalent is for a VBA of polygons.
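The triple-packing step can be sketched like this in plain C (POINT is declared locally so the snippet is self-contained; on Windows it comes from <windows.h>, and the actual PolyPolygon() call is shown as a comment — the function name pack_triangles is mine):

```c
#include <assert.h>

typedef struct { long x, y; } POINT;  /* stand-in for the GDI POINT */

/* Pack the projected UV pixel points into the two arrays that a single
 * PolyPolygon() call expects: one flat POINT array holding the triangle
 * corners in index order, and one counts array that is simply 3 for
 * every triangle. */
static void pack_triangles(const POINT *uvPixels,   /* one per UV       */
                           const unsigned *indices, /* 3 per triangle   */
                           int numTris,
                           POINT *pts, int *counts)
{
    for (int t = 0; t < numTris; ++t) {
        pts[3 * t + 0] = uvPixels[indices[3 * t + 0]];
        pts[3 * t + 1] = uvPixels[indices[3 * t + 1]];
        pts[3 * t + 2] = uvPixels[indices[3 * t + 2]];
        counts[t] = 3;
    }
    /* One GDI call then draws the whole wireframe at once:
     * PolyPolygon(hdc, pts, counts, numTris); */
}
```

With the counts array fixed at 3 per entry, the single PolyPolygon() call replaces numTris individual Polygon() calls, which is where the speedup over per-triangle drawing comes from.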