Learning General Relativity

Math blogger Joseph Nebus is doing another A-to-Z series of posts, explaining technical terms in mathematics. He asked readers for their favorite picks of things to be covered in this series, and I came up with General Covariance – which he laid out in this post, in his signature style: using neither equations nor pop-science images like deformed rubber mattresses, but ‘just words’. As so often, he manages to explain things really well!

Actually, I asked for that term as I am in the middle of yet another physics (re-)learning project – in the spirit of my ventures into QFT a while back.

For a while now I have tried (on this blog) to cover only the physics related to something I have both education in and hands-on experience with. Regarding General Relativity I have neither: My PhD was in applied condensed-matter physics – lasers, superconductors, optics – and this article by physicist Chad Orzel about What Math Do You Need For Physics? covers well what sort of math you need in that case. Quote:

I moved into the lab, and was concerned more with technical details of vacuum pumps and lasers and electronic circuits and computer data acquisition and analysis.

So I cannot find the remotest way to justify why I would need General Relativity on a daily basis – insider jokes about very peculiarly torus-shaped underground water/ice tanks for heat pumps aside.

My motivation is what I described in this post of mine: Math-heavy physics is – for me, that means a statistical sample of 1 – the best way of bracing myself for any type of tech / IT / engineering work. This positive effect is not even directly related to the math/physics aspects of that work.

But I also noticed ‘on the internet’ that there is a community of science and math enthusiasts, who indulge in self-studying theoretical physics seriously as a hobby. Often these are physics majors who ended up in very different industry sectors or in management / ‘non-tech’ jobs and who want to reconnect with what they once learned.

For those fellow learners I’d like to publish links to my favorite learning resources.

There seem to be two ways to start a course or book on GR, and sometimes authors toggle between both modes. You can start from the ‘tangible’ physics of our flat space (spacetime) plus special relativity and then gradually ‘add a bit of curvature’ and related concepts. In this way the introduction sounds familiar, and less daunting. Or you could try to introduce the mathematical concepts at a most rigorous abstract level, and return to the actual physics of our 4D spacetime and matter as late as possible.

The latter makes a lot of sense, as you had better unlearn some things you took for granted about vector and tensor calculus in flat space. A vector must no longer be visualized as an arrow that can be moved around carelessly in space, and one must be very careful in visualizing what transforming coordinates really means.

For motivation or as an ‘upper level pop-sci intro’…

Richard Feynman’s lecture on curved space might be a very good primer. Feynman explains what curved space and curved spacetime actually mean. Yes, he is using that infamous beetle on a balloon, but he also gives some numbers obtained by back-of-the-envelope calculations that explain important concepts.

For learning about the mathematical foundations …

I cannot praise these Lectures given at the Heraeus International Winter School Gravity and Light 2015 enough. Award-winning lecturer Frederic P. Schuller goes to great lengths to introduce concepts carefully and precisely. His goal is to make all implicit assumptions explicit and to avoid allusions to misguided ‘intuitions’ one might have got used to when working with vector analysis, tensors, gradients, derivatives etc. in our tangible 3D world – covered by what he calls ‘undergraduate analysis’. Only in lecture 9 is the first connection made back to Newtonian gravity. Then it is back to math only for some more lectures, until finally our 4D spacetime is discussed in lecture 13.

Schuller mentions in passing that Einstein himself struggled with the advanced math of his own theory, e.g. in the sense of not yet distinguishing clearly between the mathematical structure that represents the real world (a topological manifold) and the multi-dimensional chart we project our world onto when using an atlas. It is interesting to pair these lectures with this paper on the history and philosophy of general relativity – a link Joseph Nebus has pointed to in his post on covariance.

When learning physics or math from videos you need to be much more disciplined than when plowing through textbooks – in the sense that you absolutely have to do every single step in a derivation on your own. It is easy to delude yourself into believing you understood something by following a derivation passively, without calculating anything yourself. So what makes these lectures so useful is that tutorial sessions have been recorded as well: Tutorial sheets and videos can be found here.
(Edit: The Youtube channel of the event does not have all the recordings of the tutorial sessions; only this conference website has them. It seems the former domain does not work any more, but the content is preserved at gravity-and-light.herokuapp.com)

You also find brief notes for these lectures here.

For a ‘physics-only’ introduction …

… I picked a classical, ‘legendary’ resource: Landau and Lifshitz give an introduction to General Relativity in the last third of the second volume in their Course of Theoretical Physics, The Classical Theory of Fields. Landau and Lifshitz’s text is terse, perhaps similar in style to Dirac’s classical introduction to quantum mechanics. No humor, but sublime and elegant.

Landau and Lifshitz need neither manifolds nor tangent bundles, and they use the 3D curvature tensor of space a lot, in addition to the metric tensor of 4D spacetime. They introduce concepts of differences in space and time right from the start, plus the notion of simultaneity. Mathematicians might be shocked by a somewhat handwaving, ‘typical physicist’s’ way of dealing with differentials, with the way vectors at different points in space are related, etc. – neglecting (at first sight only – explore every footnote in detail!) the tower of mathematical structures you actually need to do this precisely.

But I would regard Lev Landau as a sort of Richard Feynman of the East; it takes his genius not to make any silly mistakes by taking the seemingly intuitive notions too literally. And I recommend this book only when combined with a most rigorous introduction.

For additional reading and ‘bridging the gap’…

I recommend Sean Carroll’s  Lecture Notes on General Relativity from 1997 (precursor of his textbook), together with his short No-Nonsense Introduction to GR as a summary. Carroll switches between more intuitive physics and very formal math. He keeps his conversational tone – well known to readers of his popular physics books – which makes his lecture notes a pleasure to read.

Artist's concept of general relativity experiment (Public Domain, NASA, Wikimedia)

__________________________________

So this was a long-winded way to present just a bunch of links. This post should also serve as sort of an excuse that I haven’t been really active on social media or followed up closely on other blogs recently. It seems in winter I am secluding myself from the world in order to catch up on theoretical physics.

Random Things I Have Learned from My Web Development Project

It’s nearly done (previous episode here).

I have copied all the content from my personal websites, painstakingly disentangling snippets of different ‘posts’ that were physically contained in the same ‘web page’, re-assigning existing images to them, adding tags, consolidating information that was stored in different places. Raking the Virtual Zen Garden – again.

New website: A 'post.'

Draft of the layout, showing a ‘post’. Left and right pane vanish in responsive fashion if the screen gets too small.

… Nothing you have not seen in more elaborate fashion elsewhere. For me the pleasure is in creating the whole thing bottom-up, not using existing frameworks, content management systems, or templates – requiring only an FTP client and a text editor.

I spent a lot of time on designing my redirect strategy. For historical reasons, all my sites use the same virtual web server. Different sites have been separated just by different virtual directories. So in order to display the e-stangl.at content as one stand-alone website, a viewer accessing e-stangl.at is redirected to e-stangl.at/e/. This means that entering [personal.at]/[business] would result in showing the business content at the personal URL. In order to prevent this, the main page generation script checks for the virtual directory and redirects ‘bottom-up’ to [business.at]/[business].

In the future, I am going to use a new hostname for my website. In addition, I want to have the option to migrate only some applications while keeping the others tied to the old ASP scripts temporarily. This means more redirect logic, especially as I want to test all the redirects. I have a non-public test site on the same server, but I had never tested redirects there, as that means creating loads of test host names. Due to the complexity of the redirects to come, I added names like wwwdummy for every domain, redirecting to my new main test host name, in the same way as the www URLs would redirect to my new public host name.

And, lest we forget, I am obsessed with keeping old URLs working. I don’t like it if websites are migrated to a new content management system, changing all the URLs. As I mentioned before, I already use ASP.NET Routing for having nice URLs with the new site: A request for /en/2014/10/29/some-post-title does not access a physical folder; instead, the ‘flat-file database engine’ I wrote from scratch searches for the proper content text file based on a SQL string handed to it, retrieves attributes from both file name and file content, and displays the HTML content and attributes like title and thumbnail image properly.
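
As a sketch of what such a route registration could look like – the route pattern and the handler page Post.aspx are my assumptions for illustration, not necessarily the actual code:

' Hedged sketch: register a route in Global.asax so that nice URLs are
' handled by a single page that queries the flat-file engine.
Imports System.Web.Routing

Public Class Global_asax
    Inherits System.Web.HttpApplication

    Sub Application_Start(sender As Object, e As EventArgs)
        ' Maps e.g. /en/2014/10/29/some-post-title to Post.aspx, which
        ' then asks the engine for the matching content text file.
        RouteTable.Routes.MapPageRoute("post",
            "{lang}/{year}/{month}/{day}/{slug}",
            "~/Post.aspx")
    End Sub
End Class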

New website: Flat-file database.

Flat-file database: Two folders, ‘pages’ and ‘posts’. Post file names include creation date, short relative URL, and category. I use the ascx extension (actually meant for .NET ‘user controls’) as the web server will not return these files directly but responds with 404 – no need to tweak permissions.

The top menu, the tag cloud, the yearly/monthly/daily archives, the list of posts on the Home page, XML RSS Feed and XML sitemap  are also created by querying these sets of files.

New web site: File / database entry

File representing a post: Upper half – meta tags and attributes, lower half – after attribute ‘content’: Actual content in plain HTML.

Now I want to redirect from the old .asp files (to be deleted from the server at some point in the future) to these nice URLs. My preferred solution for this class of redirects is using a rewrite map hard-coded in the web server’s config file. From my spreadsheet documentation of the 1:n relation of old ASP pages to new ‘posts’ I have automatically created the XML tags to be inserted in the ‘rewrite map’.
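
A minimal sketch of that generation step – assuming a two-column CSV export of the spreadsheet (old URL; new URL), which is my invention for illustration:

Imports System.IO
Imports System.Linq

Module RewriteMapGenerator
    Sub Main()
        ' Each line of the (assumed) CSV export: oldUrl;newUrl.
        ' Skip(1) drops the header row.
        For Each line In File.ReadLines("redirects.csv").Skip(1)
            Dim cols = line.Split(";"c)
            ' Emit one <add> tag per mapping, ready to paste into the rewrite map.
            Console.WriteLine("<add key=""{0}"" value=""{1}"" />",
                              cols(0).Trim(), cols(1).Trim())
        Next
    End Sub
End Module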

Now that the boring part is over and I have scared everybody off (but just in case, you can find more technical information on the last update on the English versions of all websites, e.g. here) …

… I come up with my grand insights, click-bait X-Things-You-Need-To-Know-About-Something-You-Should-Not-Do-and-Could-Not-Care-Less style:

It is sometimes painful to read really old content, like articles, manifestos and speeches from the last century. Yet I don’t hide or change anything.

After all, this is perhaps the point of such a website. I did not go online for the interaction (of social networks – clicks, likes, comments). Putting your thoughts out there, on the internet that never forgets, is like publishing a book you cannot un-publish. It is about holding yourself accountable and aiming at self-consistency.

I am not a visual person. If I had been more courageous, I would have used plain Courier New without formatting and images. Just for the fun of it, I tested adding dedicated images to each post and creating thumbnails from them – and I admit it adds to the content. Disturbing, that is!

I truly love software development. After a day of ‘professional’ software development (simulations re physics and engineering) I am still happy to plunge into this personal web development project. I realized programming is one of the few occupations that has been part of every job I ever had. Years ago, soul-searching and preparing for the next career change, I rather figured the main common feature was teaching and know-how transfer – workshops and academic lectures etc. But I am relieved I gave that up; perhaps I just tried to live up to the expected ideal of the techie who will finally turn to a more managerial or at least ‘social’ role.

You can always find perfect rationales for irrational projects: Our web server had been hacked last year (ASP pages with spammy links were put into some folders), and from backlinks in the network of spammy links I conclude that Classic ASP pages had been targeted. My web server was then hosted on Windows 2003, at that time still fully supported. I made use of Parent Paths (../ relative URLs), which might have eased the hack. Now I am migrating to ASP.NET with the goal of turning off Classic ASP completely, and I have already got rid of the Parent Paths requirement by editing the existing pages.

This website and my obsession with keeping the old stuff intact reflect my appreciation of The Existing – Being Creative With What You Have. Re-using my old images and articles feels like re-using our cellar as a water tank. Both of which are passions I might not share with too many people.

My websites had been an experiment in compartmentalizing my thinking and writing – ‘Personal’, ‘Science’, ‘Weird’; at the very beginning the latter two were authored pseudonymously – briefly. My wordpress.com blog has been one quick shot at a Grand Unified Theory of my Blogging, and I could not prevent my personal websites from becoming more and more intertwined, too, in the past years. So finally both reflect my reluctance to separate my personal and professional selves.

My website is self-indulgent – in content and in meta-content. I realize that the technical features I have added are exactly what I need to browse my own stuff for myself, not necessarily what readers might expect or what is considered standard practice. One example is my preference for a three-pane design, and for that infinite (no dropdown-menu) archive.

New website: Category page.

Nothing slows a website down like social media integration. My text file management is for sure not the epitome of efficient programming, but I was flabbergasted by how fast it is to display nearly 150 posts at once – compared to the endless back and forth of questionable stuff between social networks, tracking, and ad sites (watch the status bar!).

However, this gives me some ideas about the purpose of this blog versus the purpose of my website. Here, on the WordPress.com blog, I feel more challenged to write self-contained, complete, edited, shareable (?) articles – often based on extensive research and consolidation of our original(*) data (OK, there are exceptions, such as this post), whereas the personal website is more of a container for drafts and personal announcements. This also explains why the technical sections of my personal websites contain collections of links rather than full articles.

(*) Which is why I totally lose my subversive sense of humour and turn into a nitpicking, furious submitter of copyright complaints if somebody steals my articles published here, on the blog. However, I wonder how I’d react if somebody infringed my rights as the ‘web artist’ featured on subversiv.at.

For 15 years I have spent a lot of time on (re-)organizing and categorizing my content. This blog has also been part of this initiative. That re-organization is what I like websites and blogs for – a place to play with structure and content, and their relationship. Again, doing this in public makes me hold myself accountable. Categories are weird – I believe they can only be done right with hindsight. Now all my websites, blogs, and social media profiles eventually use the same categories, which have evolved naturally and are very unlike what I might have planned ‘theoretically’.

Structure should be light-weight. I started my websites with the idea of first- and second-level ‘menus’ and hardly any emphasis on time stamps. But your own persona and your ideas seem to be moving targets. I started commenting on my old articles, correcting or amending what I said (as I don’t delete, see above). subversiv.at has been my Art-from-the-Scrapyard-Weird-Experiments playground, before and in addition to the Art category here, and over there I enjoyed commenting in English on German articles and vice versa. But the Temporal Structure, the Arrow of Time, was stronger; so I finally made the structure more blog-like.

Curated lists … were most often just ‘posts’. I started collecting links, like resources for specific topics or my own posts written elsewhere, but after some time I did not consider them so useful any more. Perhaps somebody noticed that I have mothballed and hidden my Reading list and Physics Resources here (the latter moved to my ‘science site’ radices.net – the URLs do still work, of course). Again: The arrow of time wins!

I loved and I cursed the bilingual nature of all my sites. Cursed, because the old structure made it too obvious when the counterpart in the other language was ‘missing’; so it felt like a translation assignment. However, I don’t like translations. I am actually not even capable of really translating the spirit of my own posts. Sometimes I feel like writing in English, sometimes I feel like writing in German. Some days or weeks or months later I feel like reflecting on the same ideas, using the other language. Now I have come up with a loose connection of an English and a German article, referencing each other via a meta attribute, which results in an unobtrusive URL pointing to the other version.

Quantitative analysis helps to correct distorted views. I thought I had written ‘so much’. But the tangle of posts and pages in the old sites obscured that the content actually translates to only 138 posts in German and 78 in English. I wrote in bursts, typically immediately before and after an important change, and the first main burst in 2004/2005 was German-only. I think the numbers would have been higher had I given up on the menu-based approach earlier, and rather written a new, updated ‘post’ instead of adding infinitesimal amendments to the existing pseudo-static pages.

Analysing my own process of analysing puts me into this detached mode of thinking. I have shielded myself from social media timelines in the past weeks and tinkered with articles, content written long before somebody could have ‘shared’ it. I feel that it motivates me again to not care about things like word count (too long), target groups (weird mixture of armchair web psychology and technical content), and shareability.

My Flat-File Database

A brief update on my web programming project.

I have preferred to create online text by editing simple text files; so I only need a text editor and an FTP client as management tools. My ‘old’ personal and business web pages are currently created dynamically in the following way:
[Code for including a script (including other scripts)]
[Content of the article in plain HTML = inner HTML of content div]
[Code for writing footer]

The main script(s) create layout containers, meta tags, navigation menus etc.

Meta information about pages or about the whole site is kept in CSV text files. There are e.g. files with tables…

  • … listing all pages of each site and their attributes – like title, key words, hover texts for navigation links – or
  • … tabulating all main properties of all web sites – such as ‘tag lines’ or the name of the CSS file.

A bunch of CSV files / tables can be accessed like a database by defining the columns in a schema.ini file, and using a text driver (on my Windows web server). I am running SQL queries against these text files, and it would be simple to migrate my CSV files to a grown-up database. But I tacked on RSS feeds later; these XML files are hand-crafted and basically a parallel ‘database’.
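
For illustration, a hedged sketch of such a query in VB.NET – the folder path and the file and column names are placeholders; the connection string is the standard one for the Jet text driver:

Imports System.Data.OleDb

Module CsvQuery
    Sub Main()
        ' schema.ini in the data folder defines the columns of each CSV 'table'.
        Dim connStr = "Provider=Microsoft.Jet.OLEDB.4.0;" &
                      "Data Source=C:\web\meta\;" &
                      "Extended Properties=""text;HDR=Yes;FMT=Delimited"""
        Using conn As New OleDbConnection(connStr)
            conn.Open()
            ' Each CSV file is addressed like a table.
            Dim cmd As New OleDbCommand("SELECT title, url FROM [pages.csv]", conn)
            Using reader = cmd.ExecuteReader()
                While reader.Read()
                    Console.WriteLine("{0} -> {1}", reader("title"), reader("url"))
                End While
            End Using
        End Using
    End Sub
End Module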

This CSV file database is not yet what I mean by flat-file database: In my new site the content of a typical ‘article file’ should be plain text, free from code. All meta information will be included in each file, instead of putting it into the separate CSV files. A typical file would look like this:

title: Some really catchy title
headline: Some equally catchy, but a bit longer headline
date_created: 2015-09-15 11:42
date_changed: 2015-09-15 11:45
author: elkement
[more properties and meta tags]
content:
Text in plain HTML.

The logic for creating formatted pages with header, footer, menus etc. has to be contained in code separate from these files; and the text files need to be parsed for meta data and content. The set of files has effectively become ‘the database’, the plain text content being just one of many attributes of a page. Folder structure and file naming conventions are part of the ‘database logic’.
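
A condensed sketch of such a parser (my own naming, not the actual code): read [name]: [value] lines until the ‘content’ marker, then treat the rest of the file as the HTML body.

Imports System.IO
Imports System.Linq
Imports System.Collections.Generic

Module ArticleParser
    ' Parse one article file into a dictionary of attributes.
    Function Parse(path As String) As Dictionary(Of String, String)
        Dim attrs As New Dictionary(Of String, String)
        Dim lines = File.ReadAllLines(path)
        For i = 0 To lines.Length - 1
            If lines(i).Trim() = "content:" Then
                ' Everything below this marker is the plain-HTML content.
                attrs("content") = String.Join(vbCrLf, lines.Skip(i + 1))
                Exit For
            End If
            Dim pos = lines(i).IndexOf(":"c)
            If pos > 0 Then
                attrs(lines(i).Substring(0, pos).Trim()) = lines(i).Substring(pos + 1).Trim()
            End If
        Next
        Return attrs
    End Function
End Module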

I figured this was all an unprofessional hack until I found many so-called flat-file / database-less content management systems on the internet, intended to be used with smaller sites. They comprise some folders with text files, to be named according to a pre-defined schema, plus parsing code that extracts meta data from the files’ contents.

Motivated by that find, I created the following structure in VB.NET from scratch:

  • Retrieving a set of text files from the file system based on search criteria – e.g. for creating the menu from all pages, or for finding the one specific file that should represent the current page – current as per the URL the user entered.
  • Code for parsing a text file for lines having a [name]: [value] structure
  • Processing the nice URL entered by the user to make the web server pick the correct text file.

Speaking of URLs, so-called ASP.NET Routing came in handy: Before, I had used a few folders whose default page redirects to an existing page (such as /heatpump/ redirecting to /somefolder/heatpump.asp). Otherwise my URLs all corresponded to existing single files.

I use a typical blogging platform’s schema with the new site: If a user enters

/en/2015/09/15/some-cool-article/

the server accesses a text file whose name contains language, date, and title, such as:

2015-09-15_en_some-cool-article.txt

… and displays the content at the nice URL.
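
So the mapping from the routed URL segments to a file name is a one-liner – sketched here with assumed parameter names:

Module UrlHelpers
    ' Build the file name from the URL segments of the example above:
    ' /en/2015/09/15/some-cool-article/ -> 2015-09-15_en_some-cool-article.txt
    Function PostFileName(lang As String, year As String, month As String,
                          day As String, slug As String) As String
        Return String.Format("{0}-{1}-{2}_{3}_{4}.txt", year, month, day, lang, slug)
    End Function
End Module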

‘Language’ is part of the URL: If a user with a German browser explicitly accesses a URL starting with /en/, the language is effectively set to English. However, if the main page is hit, I detect the language from the header sent by the client.
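
Sketched as a simple fallback chain – the helper name and the default language are my assumptions:

Imports System.Web

Module LanguageHelper
    ' URL prefix wins; otherwise fall back to the Accept-Language header.
    Function DetectLanguage(request As HttpRequest) As String
        If request.Path.StartsWith("/en/") Then Return "en"
        If request.Path.StartsWith("/de/") Then Return "de"
        ' UserLanguages reflects the Accept-Language header, e.g. "de-AT".
        If request.UserLanguages IsNot Nothing AndAlso
           request.UserLanguages.Length > 0 AndAlso
           request.UserLanguages(0).StartsWith("de") Then Return "de"
        Return "en"
    End Function
End Module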

I am not overly original: I use two categories of content – posts and pages – corresponding to text files organized in two different folders in the file system, and following different conventions for file names. Learning from my experience with hand-crafted menu pages in this blog here, I added:

  • A summary text included in the file, to be displayed in a list of posts per category.
  • A list of posts in a single category, displayed on the category / menu page.

The category is assigned to the post simply as part of the file name; moving a post to another category is done by renaming it.

Since I found that having to add my Google+ posts to just a single Collection was a nice exercise, I deliberately limit myself to one category per post.

Having built all the required search patterns and functions for creating lists of posts or menus or recent posts, or for extracting information from specific pages such as the current page or the corresponding page in the other language … I realized that I needed a better, clear-cut separation between the high-level query – for a bunch of attributes of any set of files meeting some criteria – and the lower level doing the search, file retrieval, and parsing.

So why not use genuine SQL commands at the top level – to be translated to file searches and file content parsing on the lower level?

I envisaged building the menu of all pages e.g. by executing something like

SELECT title, url, headline from pages WHERE isMenu=TRUE

and creating the list of recent posts on the home page by running

SELECT * FROM posts WHERE date_created < [some date]

This would also allow for a smooth migration to an actual relational database system if the performance of the file-based database should turn out not to be that great after all.

I underestimated the effort of ‘building your own database engine’, but finally the main logic is done. My file system recordset class has this functionality (and I think I finally got the hang of classes and objects):

  • Parse a SQL string to check if it is well-formed.
  • Split it into pieces and translate pieces to names of tables (from FROM) and list of fields (from SELECT and WHERE).
  • For each field, check (against my schema) if the field is encoded in the file’s name or if it is part of the name / value attributes in the file’s contents.
  • Build a file search pattern string with * at the right places from the file name attributes.
  • Get the list of files meeting this part of the WHERE criteria.
  • Parse the contents of each file and exclude those not meeting the ‘content fields’ criteria specified in the WHERE clause.
  • Stuff all attributes specified in the SELECT statement into a table-like structure (a DataTable in .NET) and return a recordset object that can be queried and handled like recordsets returned by standard database queries – that is: check for End Of File, MoveNext, or return the value of a specific cell in a column with a specific name. (A reduced sketch follows below.)
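
To give an idea, a heavily reduced sketch of the parsing step – my own naming; the real class also translates the WHERE clause into file searches:

Imports System.Data
Imports System.Text.RegularExpressions

Module MiniFileSql
    ' Parse the SELECT and FROM parts of a simple SQL string and prepare
    ' the DataTable; filling the rows from the parsed files is omitted here.
    Function Query(sql As String) As DataTable
        Dim m = Regex.Match(sql,
            "^SELECT\s+(?<fields>.+?)\s+FROM\s+(?<table>\w+)",
            RegexOptions.IgnoreCase)
        If Not m.Success Then Throw New ArgumentException("Malformed SQL")
        Dim result As New DataTable(m.Groups("table").Value)
        For Each f In m.Groups("fields").Value.Split(","c)
            result.Columns.Add(f.Trim())
        Next
        ' ... next: map fields to file-name vs. in-file attributes, build the
        ' search pattern, parse each matching file, add one DataRow per hit.
        Return result
    End Function
End Module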

Now I am (re-)creating all collections of pages and posts using my personal SQL engine. In parallel, I am manually sifting through old content and turning my web pages into articles. To do: The tag cloud and handling tags in general, and the generation of the RSS XML file from the database.

The new site is not publicly available yet. At the time of writing of this post, all my sites still use the old schema.

Disclaimers:

  • I don’t claim this is the best way to build a web site / blog. It’s a fun project for the sake of developing it – exploring the limits of flat-file databases, and forcing myself to deal with potential performance issues.
  • It is a deliberate choice: My hosting space allows for picking from different well-known relational databases and I have done a lot of SQL Server programming in the past months in other projects.
  • I have a licence for Visual Studio. Using only a text editor instead is a deliberate choice, too.

On Learning

Some years ago I was busy with projects that required a lot of travelling, but I also needed to stay up-to-date with the latest product features and technologies. When a new operating system was released, a colleague asked how I could do that – without having time for attending trainings. Without giving it too much thought, and having my personal test lab in mind, I replied:

I think I always try to solve some problem!

tl;dr – you can skip the rest as this has summed it all up.

About one year ago I ‘promised’ to write about education, based on my experiences as a student and as a lecturer or trainer. I haven’t done so thus far – as I am not sure if my simplistic theory can be generalized.

There are two very different modes of learning that I enjoy and consider effective:

  1. Trying to solve some arbitrary problem that matters to me (or a client) and starting to explore the space of knowledge from that angle.
  2. Indulging in so-called theory seemingly totally unrelated to any practical problem to be solved.

Mode 2 was what I tried to convey in my post about the positive effects of reading theoretical physics textbooks in the morning. The same goes for cryptography.

I neither need advanced theoretical physics when doing calculations for heat pump systems, nor do I need the underlying math and computer science when tweaking digital certificates. When I close the theory books, I am in mode 1.

In the last weeks that mode 1 made me follow a rather steep learning curve with respect to database servers and SQL scripts. I am sure I have made every possible stupid mistake when exploring all the options. I successfully killed performance with too many nested sub-queries, and it took me some time to recognize that referring to the previous row is not as straightforward as in a spreadsheet program. One could argue that a class on database programming might have been more effective here, and I cannot prove otherwise. But most important for me was: I finally achieved what I wanted, and it was pure joy all the way. I am a happy dilettante, perhaps.

I might read a theoretical book on data structures and algorithms someday and let it merge with my DIY tinkering experience in my subconsciousness – as this is how I think those two modes work together.

As for class-room learning and training, or generally learning with or from others, I like those ways best that cater to my two modes:

I believe that highly theoretical subjects are suited best for traditional class-room settings. You cannot google the foundations of some discipline as such foundations are not a collection of facts (each of them to be googled) but a network of interweaving concepts – you have to work with some textbook or learn from somebody who lays out that network before you in a way that allows for grasping the structure – the big picture and the details. This type of initial training also prepares you for future theoretical self-study. I still praise lectures in theoretical physics and math I attended 25 years ago to the skies.

And then there is the lecturer speaking to mode 1: The seasoned expert who shares ‘notes from the field’. The most enjoyable lecture in my degree completed last year was a geothermal energy class – given by a university professor who was also the owner of an engineering consultancy doing such projects. He introduced the theory in passing, but he talked about the pitfalls that you would not expect from learning about best practices and standards.

I look back on my formal education(s) with delight, as most of the lectures, labs, or projects appealed to either mode 1 or mode 2. In contrast to most colleagues I loved the math-y theory. In projects, on the other hand, I had ample freedom to play with stuff – devices, software, technology – and to hone practical skills, fortunately without much supervision. In retrospect, the universities’ most important role with respect to the latter was to provide the infrastructure. By infrastructure I mean expensive equipment – such as the pulsed UV lasers I once played with – or contacts to external ‘clients’ whom you would not have had a chance to get in touch with otherwise. Two years ago I did the simulations part of a students’ group project, which was ‘ordered’ by the operator of a wind farm. I brought the programming skills to the table – as this was not an IT degree program – but I was able to apply them to a new context and learn about the details of wind power.

In IT security I have always enjoyed the informal exchange of stories from the trenches with other experienced professionals – this includes participation in related forums. Besides, it fosters community spirit, and there is no need to do content-less ‘networking’ of any other sort. I have just a few days of formal education in IT.

But I believe that your mileage may vary. I applied my preferences to my teaching, that is: explaining theory in – probably too much – depth and then jumping onto any odd question asked by somebody and trying something out immediately. I was literally oscillating between the flipchart and the computer with my virtual machines – I had been compared to a particle in quantum mechanics whose exact location is unknown because of that. I am hardly able to keep to my own agenda, even when I have been given every freedom to design a lecture or training and to write every slide from scratch. And I look back in horror on delivering trainings (as an employed consultant) based on standardized slides that were not to be changed. I think I was not the best teacher for students and clients who expected well-organized trainings – but I know that experts enjoyed our jam sessions, formerly called workshops.

When I embarked on another degree program myself three years ago, I stopped doing any formal teaching – before that, I had given a lecture on Public Key Infrastructure in a master’s degree program in IT security for some years. Having completed my degree in renewable energy last year, I figured that I was now done with any formal learning. So far, I feel that I don’t miss out on anything, and I stay away from related job offerings – even ‘prestigious’ ones.

In summary, I believe in pure, hard theory, not to be watered down and not necessarily to be made more playful – combined with learning most intuitively, and in an unguided fashion, from other masters of the field and from your own experiments. This is playful no matter how often you bang your head against the wall when trying to solve a puzzle.

Physics book from 1895

A physics book written in 1895, a farewell present from former colleagues in IT – one of the greatest gifts I ever got. My subconsciousness demands this is the best way to illustrate this post. I have written a German post on this book which will most likely never be translated, as the essence of that post are quotes showing the peculiar use of the German language, which strikes the modern reader as quite odd.

How to Introduce Special Relativity (Historical Detour)

I am just reading the volume titled Waves in my favorite series of ancient textbooks on Theoretical Physics by German physics professor Wilhelm Macke. I tried to resist the urge to write about seemingly random fields of physics, and probably weird ways of presenting them – but I can’t resist any longer.

There are different ways to introduce special relativity. Typically, the Michelson-Morley experiment is presented first, as our last attempt in a futile quest to determine the absolute speed in relation to “ether”. In order to explain its results we have to accept the fact that the speed of light is the same in any inertial frame. This is weird and non-intuitive: We probably can’t help but compare a ray of light to a bunch of bullets or a fast train – whose velocity relative to us does change with our velocity. We can outrun a train, but we can’t outrun light.

Michelson–Morley experiment

The Michelson–Morley experiment: If light travels in a system – think: space ship – that moves at velocity v with respect to absolute space, the resulting velocity should depend on the angle between the system’s velocity and the absolute velocity – just in the same way as the observed relative velocity of a train becomes zero if we manage to ride beside it in a car driving at the same speed as the train. But this experiment shows – via the absence of interference between beams of allegedly varying velocities – that we must not calculate relative velocities for beams of light. (Wikimedia)

Yet, not accepting it would lead to even weirder consequences: After all, the theory of electromagnetism had always been relativistically invariant. The speed of light shows up as a constant in the related equations, which explain perfectly how waves of light behave.

I think the most straight-forward way to introduce special relativity is to start from its core ideas (only) – the constant speed of light and the equivalence of frames of reference. This is the simplicity and beauty of symmetry. No need to start with trains and lightning bolts, as Matthew Rave explained so well. For the more visually inclined there is an ingenious and nearly purely graphical way, called k-calculus (which is, however, seldom taught, AFAIK – I had stumbled upon it once in a German book on relativity).

From the first principles all the weirdness of length contraction and time dilation follows naturally.

But is there a way to understand it a bit better?

Macke also starts from the Michelson-Morley experiment – and he adds the fact that it can be “explained” by Lorentz’ contraction hypothesis: Allowing for direction-dependent velocities – as in “ether theory” – but adding the odd fact that rulers contract in the direction of the unobservable absolute motion makes the differences in the paths the rays of light traverse go away. It also “explains” time dilation if you consider your typical light clock and factor in the contraction of lengths:

Light clock

The classical light clock: Light travels between two mirrors. When it hits a mirror it “ticks”. If the clock moves relative to an observer, the path to be traversed between ticks appears longer. Thus measurement of time is tied to measurement of spatial distances.
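
Not Macke’s actual derivation, just the standard back-of-the-envelope check (in LaTeX notation): with mirror distance L, a tick takes t_0 = 2L/c in the clock’s rest frame; for the clock moving at speed v, the light path per tick is the hypotenuse of a right triangle, so

\left(\frac{ct}{2}\right)^2 = L^2 + \left(\frac{vt}{2}\right)^2
\quad\Rightarrow\quad
t = \frac{2L/c}{\sqrt{1 - v^2/c^2}} = \gamma\, t_0

The moving clock’s tick is stretched by the factor \gamma.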

However, length contraction could be sort of justified by tracing it back to the electromagnetic underpinnings of stuff we use in the lab. And it is the theory of electromagnetism where the weird constant speed of light sneaks in.

Contraction can be visualized by noting that rulers and clocks are ultimately made from atoms, ions, or molecules, whose positions are determined by electromagnetic forces. The perfect sphere of the electrostatic potential around a point charge would be turned into an ellipsoid if the charge starts moving – hence the contraction. You could hypothesize that only “electromagnetic stuff” is subject to contraction and that there might be “mechanical stuff” that would allow for measuring true time and spatial dimensions.

Thus the new weird equations about contracting rulers and slowing time are introduced as statements about electromagnetic stuff only. We use them to calculate back and forth between lengths and times displayed on clocks that suffer from the shortcomings of electromagnetic matter. The true values for x,y,z,t are still there, but finally inaccessible as any matter is electromagnetic.

Yes, this explanation is messy, as you mix underlying – but not accessible – direction-dependent velocities with the contraction postulate added on top. This approach misses the underlying simplicity of the symmetry in nature. It is a historical approach, probably trying to do justice to the mechanical thought experiments involving trains and clocks that Einstein had also used (and that could be traced back to his childhood, spent basically in the electrical engineering company run by his father and uncle, according to this biography).

What I found fascinating though is that you get consistent equations assuming the following:

  • There are true co-ordinates we can never measure; for those, Galilean Transformations remain valid, that is: Time is the same in all inertial frames, and distances just differ by time times the speed of the frame of reference.
  • There are “apparent” or “electromagnetic” co-ordinates that follow Lorentz Transformations – of which length contraction and time dilation are consequences. (Both sets of transformations are written out below.)
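
Both sets, written out for one spatial dimension in standard textbook form (with \gamma = 1/\sqrt{1 - v^2/c^2}):

\text{Galilean:}\qquad x' = x - vt, \qquad t' = t
\text{Lorentz:}\qquad x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right)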

To make these sets of transformations consistent, you have to take into account that you cannot synchronize clocks in different locations if you don’t know the true velocity of the frame of reference. Synchronization is done by placing an emitter of light right in the middle between the two clocks to be synchronized, sending signals to both clocks. This is correct only if the emitter is at rest with respect to both clocks. But we cannot determine when it is at rest, because we never know the true velocity.

What you can do is assume that one frame of reference is absolutely at rest, thus implying that (true) time is independent of spatial dimensions, and that the other frame of reference, moving in relation to it, suffers from the problem of clock synchronization – thus in this frame true time depends on the spatial co-ordinates used in that frame.

The final result is the same when you eliminate the so-called true co-ordinates from the equations.

I don’t claim it’s the best way to explain special relativity – I just found it interesting, as it tries to take the merely hypothetical nature of 4D spacetime as far as possible while giving results in line with experiments.

And now explaining the really important stuff – and another historical detour in its own right

Yes, I changed the layout. My old theme, Garland, had been deprecated by wordpress.com. I am nostalgic – here is a screenshot – as a courtesy to visitors who will read this in 200 years.

elkement.wordpress.com with theme Garland

elkement.wordpress.com using theme Garland – from March 2012 to February 2014 – with minor modifications made to colors and stylesheet in 2013.

I had checked it with an iPhone simulator – and it wasn’t simply too big or just “not responsive”: the top menu bar and the boundaries of divs looked scrambled. Thus I decided the days of Garland, the three-column layout, were over.

Now you can read my 2,000-word posts on your mobile devices – something I guess everybody has eagerly anticipated.

And I have just moved another nearly 1,000 words of meta-philosophizing on the value of learning such stuff (theory of relativity, not WordPress) from this post to another draft.

Non-Linear Art. (Should Actually Be: Random Thoughts on Fluid Dynamics)

In my favorite ancient classical mechanics textbook I found an unexpected statement. I think 1960s textbooks weren’t expected to be garnished with geek humor or philosophical references as much as seems to be the default today – which is why Feynman’s books were so refreshing.

Natural phenomena featured by visual artists are typically those described by non-linear differential equations. Those equations allow for the playful interactions of clouds and water waves of ever changing shapes.

So fluid dynamics is more appealing to the artist than boring electromagnetic waves.

Grimshaw, John Atkinson - In Peril - 1879

Is there an easy way to explain this without too much math? Most likely not, but I’ll try anyway.

I try to zoom in on a small piece of material: an incredibly small cube of water in a flow, at a certain point in time. I imagine this cube as decorated with color. This cube will change its shape quickly and turn into some irregular shape – there are forces pulling and pushing, e.g. gravity.

This transformation is governed by two principles:

  • First, mass cannot vanish. This is classical physics; no need to consider the generation of new particles from the energy of collisions. Mass is conserved locally: if some material suddenly shows up at some point in space, it has to have been travelling to that point from adjacent places. (This is the continuity equation, written out below.)
  • Second, Newton’s law is at play: Forces are equal to the change of momentum per unit time. If we know the force acting at time t and point (x,y,z), we know how much momentum will change in a short period of time.
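
The first principle is nothing but the continuity equation, here in standard notation (\rho is the mass density, \vec{v} the velocity field):

\frac{\partial \rho}{\partial t} + \nabla \cdot \left(\rho\,\vec{v}\right) = 0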

Typically any course in classical mechanics starts from point particles such as cannon balls or planets – masses that happen to be concentrated in a single point in space. Knowing the force at a point in time at the position of the ball, we know the acceleration, and we can calculate the velocity in the next moment of time.

This also holds for our colored little cube of fluid – but we usually don’t follow decorated lumps of mass individually. The behavior of the fluid is described perfectly if we know the mass density and the velocity at any point of time and space. Think little arrows attached to each point in space, probably changing with time, too.

Aerodynamics of model car

Digesting that difference between a particle’s trajectory and an anonymous velocity field is a big conceptual leap, from my point of view. Sometimes I wonder if it would be better not to learn about the point approach in the first place, because it is so hard to unlearn later. Point particle mechanics is included as a special case in fluid mechanics – the flowing cannon ball is represented by a field that has a non-zero value only at positions equivalent to the trajectory. Using the field-style description we would say that part of the cannon ball vanishes behind it and re-appears ‘in front of’ it, along the trajectory.

Pushing the cube also moves it to another place where the velocity field differs. Properties of that very decorated little cube can change at the spot where it is – this is called an explicit dependence on time. But they can also change indirectly, because parts of the cube are moved with the flow: the cube changes with time due to moving in space over a certain distance. That distance is again governed by the velocity – distance is velocity times period of time.

Thus, for one spatial dimension, the change of velocity dv associated with an elapsed dt is also related to a spatial shift dx = vdt. Starting from a mean velocity of our decorated cube v(x,t), we end up with v(x + vdt, t + dt) after dt has elapsed and the cube has been moved by vdt. For the cannon ball we could have described this simply as v(t + dt), as v was not a field.
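
Expanding v(x + v\,dt,\, t + dt) to first order in dt makes both contributions explicit – note the factor v in the second, convective term (in LaTeX notation):

dv = \frac{\partial v}{\partial t}\,dt + \frac{\partial v}{\partial x}\,v\,dt
\quad\Rightarrow\quad
\frac{Dv}{Dt} = \frac{\partial v}{\partial t} + v\,\frac{\partial v}{\partial x}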

And this is where non-linearity sneaks in: The indirect contribution via moving with the flow, also called convective acceleration, is quadratic in v – the spatial change of v is multiplied by v again. If you then allow for friction, even nastier terms show up in the parts of the Navier-Stokes equations describing the forces.

My point here is that even if we neglect dissipation (describing what is tongue-in-cheek called dry water), there is already non-linearity. The canonical example for wavy motions – water waves – is actually rather difficult to describe due to that, and you need to resort to considering small fluctuations of the water surface even if you start from the simplest assumptions.

The tube

In Praise of Textbooks with Tons of Formulas (or: The Joy of Firefighting)

I know. I am repeating myself.

Maurice Barry has not only recommended Kahneman’s Thinking, Fast and Slow to me, but he also runs an interesting series of posts on his eLearning blog.

These got mixed and entangled in my mind, and I cannot help but return to that pet topic of mine. First, some statistically irrelevant facts from my personal observations – probably an example of narrative fallacy, or of mistaking correlation for causation:

As you know, I had planned to reconnect to my roots as a physicist for a long time, despite working crazy schedules as a so-called corporate knowledge worker. Besides making the domain subversiv.at mine and populating it with content similar to the weirdest in this blog, I invented my personal therapy to deflect menacing burn-out: I started reading, or better working with, my old physics textbooks. Due to time constraints I sometimes had to do this very early in the morning – and I am not a lark. I have read three books on sleep research recently – I know that both my sleep duration and my midsleep are above average, and that I have lived in a severely sleep-deprived state most of my adult life.

Anyway, the point was: Physics textbooks gave me some rehash of things I had forgotten and prepared me to e.g. work with the heat transfer equation again. But what was more important: These books transformed my mind in unexpected ways. Neither entertaining science-is-cool pop-sci books nor philosophical / psychological books about life, the universe and everything could do this for me at that level. (For the record: I tried these too, and I am not shy to admit I picked up some self-help books also. Dale Carnegie, no less.)

There were at least two positive effects – I try to describe them in my armchair psychologist’s language. Better interpretations welcome!

Concentrated, abstract reasoning seems to be effective in stopping or overruling the internal over-thinking machine that runs in circles if you feel trapped in your life or career. Probably people like me try to over-analyze what has to be decided intuitively anyway. Keeping the thinking engine busy lets the intuitive part do its work. Whatever it was – it was pleasant, and despite the additional strain on sleep and schedule it left me more energetic, more optimistic, and above all more motivated and passionate about that non-physics work.

I also found that my work-related results – the deliverables, as we say – improved. I have always been the utmost perfectionist, and my ability to create extensive documentation in parallel to doing the equivalent of cardiac surgery on IT systems is legendary (so she says in her modest manner). Nevertheless, plowing through tensor calculus and field equations helps to hone these skills even more. For those who aren’t familiar with that biotope: The mantra of other Clint-Eastwood-like firefighters is rather: Real experts don’t provide documentation!

I would lie if I described troubleshooting issues with digital certificates as closely related to theoretical physics. You can make some remote connections between skills that are sort of related – cryptography is math, after all – but I am not operating at that deep mathematical level most of the time. I rather believe that anything rigorous and mathy puts your mind – or better, its analytical subsystem – into an advanced state. Advanced refers to being better prepared to tackle a specific class of problems. The caveat is that you lose this ability if you stop reading textbooks at 4:00 AM.

Using Kahneman’s terminology (mentioned briefly in my previous post), I consider mathy science the ultimate training for system 2 – your typically slow, rational decision-making engine. It takes hard work and dedication at the beginning to make system 2 work effortlessly in some domains. In my very first lecture at the university ever, the math professor stated that mathematics will purge and accelerate your brain – and right he was.

Hence I am so skeptical about joyful learning and that science-is-cool-look-at-that-great-geeky-video-of-black-holes-and-curved-space approach. There is no simple and easy shortcut, and you absolutely, positively have to love the so-called tedious work you need to put in. You are rewarded later with that grand view from the top of the mountain. The ‘trick’ is that you don’t consider it tedious work.

Kahneman is critical of so-called intuition – effortless intuitive system 1 at work – and he gives convincing accounts of cold-hearted algorithms beating humans, e.g. in picking the best candidate for a job. However, he describes his struggles with another school of thought of psychologists who are wary of algorithms. I have scathed dumb HR-acronym-checking bots on this blog, too. But Kahneman finally reached an agreement with the algorithm haters, as he acknowledged that there is a specific type of expert intuition that appears like magic to outsiders. His examples: Firefighters and nurses who feel what is wrong – and act accordingly – before they can articulate it. He still believes that picking stocks or picking job applicants is not a skill, and that positive results don’t correlate with skill but are completely random.

I absolutely love the example of firefighters as I can literally relate to it. Kahneman demystifies their magic abilities though as he states that this is basically pattern recognition – you have gathered similar experience, and after many years of exposure system 1 can draw from that wealth of patterns unconsciously.

Returning to my statistically irrelevant narrative, this still does not explain completely why exposure to theoretical physics should make me better at analyzing faulty security protocols. Physics textbooks make you an expert in solving physics textbook problems, that is: in recognizing patterns, and they provide you with that type of out-of-the-box idea you sometimes need to find a clever mathematical proof. You might get better at solving those physics puzzles people enjoy sharing on social media.

But probably the relation to troubleshooting tech problems is very simple, and it boils down to the fact that you love to tackle formal, technical problems again and again, even if many attempts are in vain. The motivation and the challenge lie in looking at the problem as a black box and trying to find a clever way in. Every time you fail you learn something nonetheless, and that learning is a pleasure in its own right.

DOD mobile aircraft firefighting training device