13 questions you might ask an IT company in an interview

Reading Time: < 1 minute
  1. What can I expect on a Monday morning at 07:30 AM in the office?
  2. How much per head does the company put into perks? (Tangible employee benefits, excluding insurance etc.)
  3. What is the average salary of the workforce?
  4. What are the top 10% making in the company, and what are their roles?
  5. Why do people work remotely in your company? Give 3 reasons you suspect are the case.
  6. How much (as a % of effective workdays) is de facto done remotely?
  7. HOW does your company grow?
  8. Name 2 big, financially relevant milestones that your company achieved last year.
  9. Do you intend to take over the world in your own segment?
  10. Who owns your company?
  11. What are the biggest reasons your employees come to work on-site, if they have remote work as an option?
  12. If profits rise 50% this year, how is that reflected in the average salary?
  13. Are the answers you gave “data-backed”, i.e. factual?

 

Atom, the nobility of editors?

Reading Time: 3 minutes

Atom is an interesting editor. For some people, as you might guess, the choice of editor is a sacred matter. A bit like your favorite hockey team!

Editors are among the most central tools for many a “clientele”, so a certain dogmatic fervor surrounds them. I'll say it straight: for as long as I have been coding, editors have been at the center of zealotry and sectarianism. Different alternatives have come (and gone): Qedit, Emacs, vim, Sublime Text, the many different Notepads, and now, for example, Atom. I probably forgot a dozen reasonably familiar editors from that list. All in all, there are probably at least a thousand text editors out there.

Back in the day I ended up building all kinds of small things around editors, and even whole simple editors were coded from a clean slate. Since in the largely text-based MS-DOS you could not do much to directly affect the look of the font an editor used, a hacker friend and I whipped up a background program in assembler that forced the font into the shape we wanted (SFONT). As a side dish we of course needed a utility for designing fonts, that is, a small drawing program. That got written too.

That said, a complete text editor is a complex enough program that you will not knock out a new one in a flash. That is why Atom and many other good editors are open source projects with at least several core developers on board.

Sublime Text, on the other hand, was born, as the lore goes, when a coder in a rather comfortable position at Google got a vision and a mission to build the world's best text editor. The vision carried him away: he quit his job at Google and has, as far as I understand, been pursuing that dream ever since. A rich ecosystem has also grown around Sublime; ‘packages’ allow extending the editor in much the same style as in Atom.

That previously mentioned MS-DOS, by the way, had some really powerful pure text editors as well as IDEs. All character-based, which in a certain way is ‘in’ again, as is customizing your own editor. “Your mileage may vary”: some want to get to work quickly, without any customization; others think the editor should feel like home and be efficient with all its tweaks.

Endlessly tuning the editor, especially at the start of a project, can sometimes be a painful experience, particularly if you are already itching to write the actual project code. The editor is best pruned a bit like a bonsai tree, not full-time.

The birth of Atom

Alright, the obligatory chuckle to start with:

The Q shell is the outermost shell, seen from the nucleus, in which the atom of any currently known element has electrons in its ground state.

Sorry. The proper context is below:

Atom and Electron are a pair on the software side as well.

Let's take a step back into history:

At the very beginning there was some vague long-haired psychedelic hippie business in the 1960s. The Internet did not exist yet. Well, soon it did.

Then came the 1980s corporate coders; nobody was interested in them. You know: floppy disks, later hard drives; PCs, bookkeeping, office sweat, the waterfall model and assembly-line coding; barely 16 colors, Windows. The 1980s corporate era did, however, produce one epic work: Mike Judge's film “Office Space” (Konttorirotat in Finnish).

After that came the IT boom at the turn of the millennium, followed by the golden age of hipsterism. The birth of Github also belongs to the era of hipsterism. Among the principles of hipsterism: if you can mod something, mod it. If something can be written in Javascript or one of its derivatives, do it!

A nice idea: one codebase, Electron handles the rest

Electron provides a platform-independent shell for applications. Atom is built fairly purely from ingredients familiar from the web: HTML, CSS, Javascript. That runs in its own “machine”, that is, its own process. With Electron, the end result looks the same on different platforms. The advantage is the classic one: you do not need to maintain as many codebases as there are operating systems. The counterweight is that Electron, built on top of Chrome, carries quite a bit of “weight”. Even a small, empty sandbox process takes a slice of RAM larger than 100 megabytes. Then again, that rarely matters, since most machines usually have enough free RAM available. On the smallest machines with 2 GB of main memory, it can matter how much the applications consume.
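To make the “one codebase, Electron handles the rest” idea concrete, here is a minimal sketch of how an Electron app bootstraps. This is not Atom's actual code, just an illustration assuming the electron npm package and a hypothetical index.html:

    // Minimal Electron "shell" sketch: the main process opens a window whose UI
    // is plain HTML/CSS/JS rendered by Chromium (hypothetical index.html assumed).
    import { app, BrowserWindow } from 'electron';

    function createWindow(): void {
      const win = new BrowserWindow({ width: 1024, height: 768 });
      win.loadFile('index.html'); // the editor UI would live here, in web tech
    }

    // The same code runs unchanged on Linux, macOS and Windows.
    app.whenReady().then(createWindow);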

Atom's history is interesting in itself. It was initiated by the Github crew as Github's “official editor”. Freedom and customizability have been strongly emphasized. Technically, Atom is built on top of the Electron shell, which provides platform independence. Electron is a bit like Java with its graphics libraries, or C++ with Qt. The idea in all cross-platform bridges is the same: an offering built well once greatly reduces the porting work of the code running on top of the bridge. Development is thus layered: you trust that “someone takes care of the details”.

Early on, Electron also brought a speed penalty; Atom was not known for being fast. In the early days the editor also crashed quite often. These teething problems have largely been overcome. (I am currently running version 1.24.1 and cannot offhand recall the editor crashing recently.)

What about updating? Easy!

Updating Atom, at least on Linux, is quite a straightforward process. The new .deb package is installed with dpkg -i over the old one, and that's it. Done! Existing settings and customizations are preserved, as they should be.

All of Atom's customization is stored in the directory:

~/.atom

 

Leaps, small and big problems

Reading Time: 5 minutes
Computers are about wishful thinking. It sounds perhaps ridiculous and odd, but it's so true.
Basically the thing is this:
  • the world has an infinite supply of little and big problems that can be solved
  • computers can be used in probably most of these cases
  • software or devices alone rarely solve real problems per se; some degree of change in the organization, attitudes and learning is also required

Envisioning

Often computer scientists, programmers, consultants come across situations that are highly exciting – blasts of thoughts – which leave behind a settling dust of craving.

Computer science can be like maths, at least judging by the little I know of both.

First you have a curious tough nut, a problem to be cracked:

  • traffic congestion near and within metropolitan cities
  • invoice handling
  • communications between people
  • storage of video footage
  • understanding the brain in clinical research
  • the role of DNA in our fate
  • making marketing effective, smart and unobtrusive
  • seeking truth (scientific method)
  • writing a good book, quickly (in less than 30 days)

As the physical resources of computing grow, algorithmic limits on space and compute still exist. We don't live in a computing utopia, not yet. Perhaps we basically never will.

It might be that at some point we will have scalable, automated production of intelligence that can itself increase the growth rate of that intelligence, which… Where this would lead us is an open question. Some call this kind of scenario the “singularity”. Perhaps. It might be that I'm mistaken about this. Check out Kurzweil's book “The Singularity is Near: When Humans Transcend Biology”. [at Amazon]

Expectations of a perfect world?

It might be that we will never live in a “perfect world”, even when it comes to information needs.

The 1990s and early 2000s optimism about the effects of the World Wide Web has probably waned somewhat due to the ‘fake news’ allegations surfacing all over the world after a historically ugly presidential election in the United States (Trump vs. Clinton).

Needless to say, those who have witnessed the evolution of the Web knew even before the election that on the Internet, nobody knows you're a dog. The Internet teaches (or should teach) the crucial skill of being critical towards any news source, in a healthy way. “Not everything you see online is true.”

Some human touch needs to go into the equation of coming up with and providing better services.

When the economy heats up, and computers and all of IT are hot again, things start to happen. A lot of things.

Hype is a strange phenomenon. The world has had hype probably since the inception of our brains, from as early on as we have existed. Think about a tribe, far far back in time, who discovers a more habitable environment: plenty of fruit, perhaps, fresh water, an aesthetically pleasing view, fresh air, maybe other tribes.

Nowadays hype creation and sustaining is just more controlled, pervasive, and digital.

Amazon's Jeff Bezos made a really good comparison between the 2000s IT hype and the American gold rush in California. Both phenomena have similar features: people start seeking immediate and almost inexhaustible sources of wealth.

Ramifications of the people seeking wealth

I'm actually almost indifferent to the wealth side. Of course it “would be nice”, but I'm far more interested in what the chase for wealth indirectly causes.

Or chase for leisure and pleasure, for that matter. Perhaps the latter is what actually drives people. We’re lazy. We want to enjoy life! And that’s ok.

From around the year 2000 there are innumerable stories of people switching completely from other professions to IT-related ones. During the heyday of the American gold rush, people from all trades (doctors and lawyers included) left their own cities and headed west to California. Yet very few got rich. Those who made it also made the headlines: news spread of the positive side of digging one's own fortune. The gold rush nevertheless had a significant effect on society and the economy, since it created a lot of economic activity.

In entrepreneurship, the odds of folding (going out of business) are around 90% on average. That doesn't keep people from establishing companies, and that's a good thing. Entrepreneurship is about a relentless and optimistic way of thinking, in addition to feet-on-the-ground sanity.

Realization: Now is the “future”

In May 2018 I merged two blogs.

Author Douglas Coupland (left) interviewed on Channel 4

By importing 100+ blog posts from “Psiic”, which was hosted on Google's blogspot service, I also got some raw draft articles. This very post you're looking at right now is made from one of those drafts. I believe it originates from around 2005-2006, or somewhere in that ballpark. Anyway, it's quite interesting that even though these are basically 10+ year old ideas, some of what I talk about in the post has only now become mainstream.

I’ve followed the birth of home computing (my podcast on it), and keep looking into the future. As we go along, it’s really interesting to see how things turn out. There are some obvious things that will happen.

I’m a big fan of simplifying the pack of gear one has to carry around. I’m totally hopeless with trying to keep things in my backpack.

It’s hard (..if not well equipped)

Yet I can't easily let go of things; when I'm going somewhere, I want to be well equipped: all input methods (a mouse), the mouse pad, preferably a backup battery for the mobile phone – you can imagine the rest. Some perfectionism, definitely. With its ups and downs. Don't get me wrong: I'd love to just wear stylish virtual reality goggles and go about doing my business. But I'm not willing to sacrifice much of the power-computing idea; I don't want to be working with a sub-optimal setup.

“The virtual reality goggles scenario is perhaps still a good 5 to 10 years in the future”, I wrote back in 2005-2006. As I look at the world now in 2018 – well, people might be stumbling in the streets, but it's because their eyes are zoomed in on the mobile phone, not because they'd be living in augmented or virtual reality.

Things like these move onward at a surprisingly slow pace, after all.

Wireless charging – great innovation

A Finnish entrepreneur came up with a very nice invention: wireless charging for mobile phones. You'd think it's a no-brainer, an option that would quickly permeate the whole world. I mean, who loves to carry a charger around, find a suitable place and mains socket for at least 20 minutes of uninterrupted charging – and remember: the charger cable is always entangled in a knot that takes 10-30 seconds to unwind. It's MESSY! Well, wireless charging hasn't permeated the world. I bet the number of wired chargers sold has more than linearly followed the number of mobile phones sold.

Batteries, the energy question overall, and the very act of recharging a mobile phone's battery are among the Achilles' heels of mobile phones.

Termed “Powerkiss”, the wireless charging technology, along with the company, was bought by a big corporation. Then it almost vanished, at least in my experience. Well, it turns out that when I investigated the matter further, it hadn't vanished. Indeed, many cafes around the world use this technology to enhance their services for customers.

But what's particularly evident in wireless charging is, of course, competing and partially incompatible standards: different ways of implementing the details of wireless charging at the two ends, the phone and the charger pad. As a technologist and open source advocate I'm in a poor position to balk at competition; as a simple human and customer, oh how I'd love simple things!

The Yoast plugin for the Jukkasoft blog

Reading Time: < 1 minute

I fulfilled a long-standing dream: the Yoast SEO plugin is now in use on this WordPress blog. Yoast is said to be among the best in its field. It contains a lot of functionality: it does many things automatically and works as a pleasantly instructive assistant throughout the writing of a blog article. In the Gutenberg editor, Yoast can now also conveniently be kept quite passive if you don't want to stare at advice all the time.

Like all powerful technology, Yoast requires a bit of learning. I recommend, for example, reading this page.

In practice, Yoast makes it easier, among other things, to maintain the technical structure of the site so that the blog is better understood by search engines. SEO (search engine optimization) is a broad spectrum of different things, which in practice require both thinking and plain hard work.

I'll get more familiar with Yoast in the near future, and as interesting features turn up, I'll post a new article here!

Smartphones inside out – part 2: mobile networks

Reading Time: 14 minutes

I wrote a post about mobile phones a “bit” earlier. This is the follow-up, about mobile networks. They are, inevitably, something the whole mobile culture relies on. Yet I think mobile networks have received rather little interest from the general public.

5G is a hot topic right now in 2018. The generations of mobile networks mean that there is an incremental continuation in improving (usually) the speed, coverage and general capabilities of the mobile network. 5G has been a long time coming. There are stepping stones to it from the current 4G, so the change is not a “flip of a switch”.

Let’s get back to basics.

I was fascinated by mobile masts early on. I'm not “into them” so much nowadays, but I remember scaling one mast and checking it out myself as a teen.

During my studies I was part of a local radio amateur club (OH2TI) in Otaniemi for a couple of years, around 2005 and 2006. That was a much better way to get familiar with masts, radio technology, electronics, and people.

Mobile networks are the almost invisible part of our mobile culture. Without them there wouldn't be “mobility” at all; we'd just be carrying phones that could not seamlessly connect to other phones, servers, and landline phones. The proliferation of network components has led to better coverage and a faster network. The supporting infrastructure in densely populated cities differs from that in rural areas. However, the ingenuity of a mobile network is that it unifies these layers so that users feel as if the network is magically omnipresent.

Looking back now in 2017 at the roots of telecommunication, it's easy to almost forget what a long journey had to be taken to reach the quality of mobile networks available today.

While the story of Nokia Mobile Phones has somewhat waned from the public limelight, Nokia the network company goes on strong. In fact, in autumn 2018 Risto Siilasmaa is publishing a book about the current Nokia!

Network speeds in the mobile world have almost gone through the ceiling. A 50 Mbit/s mobile download speed would have been pure fantasy just a decade ago.

Nowadays it's quite evident that mobile networks are used for 2 prime purposes: transmitting speech and transmitting data. Speech used to be the sole type of payload going between a mobile phone and the network. Nowadays the roles have almost swapped: people use data, through their smartphone apps, possibly more than they talk.

SMS, or short text messages, was a curious “freak invention”. People could send a maximum of 160 characters from one phone to another. SMS did not technically “consume” bandwidth between the hops in the mobile network, since the message payload was transmitted in a control channel. SMS became a killer app; it is still used a lot, even though there are plenty of “competing” applications that use data and provide richer means to transmit icons, animations and photos along with text.

How does a mobile talk to the mobile network?

A mobile network faces the phones using radio. Radio itself is an older invention. The modern digital mobile networks nevertheless always have to work according to, and with respect for, radio principles. That's part of the “why”: we need standards bodies to regulate radio traffic; otherwise the ensuing chaos would not have enabled the development of such widespread, homogeneous-quality adoption of mobility.

Session between phone and mast

So, each mobile phone has a “discussion” (an ongoing session) with a mast. For example, our speech becomes a stream of bits: first the mobile phone digitizes the voice of its user, then sends it over the wire. The “wire” happens to be radio, not a cable; but do remember, the link between masts often is indeed a wire 🙂

In the 1990s the antenna design race was a big one. There was also a settling period over whether the antenna should be visible (a whip) or embedded cleanly into the phone. The latter won.

All things are layered, so that for example a software developer does not have to know all the gross details of radio networks. Even the mobile phone manufacturer doesn’t these days have to know all the details: manufacturers can subcontract the manufacture and design of electronic chips used in the phones.

Mobile network speed

The speed comes in three different aspects: latency, bandwidth and network coverage. A fourth aspect is jitter (the variability of quality over time).

In reality there is also a 5th aspect that is not actually about the mobile network but rather about the load on an application server. Just as with desktop computers, some sites might have inadequate capacity considering the number of users, and thus the user experience is not the best possible.

Latency is the initial setup time of a data connection, and it is also the minimal time it takes to get any amount of data from a transmitting phone to a receiving phone (or server). Latency affects the quality of real-time communications like speech, video and gaming. Less is better. Nowadays latency as low as 5-10 milliseconds is achievable in mobile networks. When data capabilities first appeared in mobile networks, the setup could take several thousand milliseconds.
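A rough way to see how latency and bandwidth combine is to model the time for a single transfer as latency plus payload divided by bandwidth. The sketch below uses assumed figures (a 500 kB payload, a 50 Mbit/s link, 10 ms latency), purely for illustration:

    // Rough model: total time ≈ latency + payload / bandwidth (assumed example figures).
    function transferSeconds(payloadBytes: number, bitsPerSecond: number, latencySeconds: number): number {
      return latencySeconds + (payloadBytes * 8) / bitsPerSecond;
    }

    // A 500 kB resource over a 50 Mbit/s link with 10 ms latency: about 0.09 s.
    console.log(transferSeconds(500_000, 50_000_000, 0.01).toFixed(3), 's');

For small payloads the latency term dominates, which is why low latency matters so much for interactive use.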

Why is mobile network speed tricky?

In mobile networks you are always out there, in the wild: things are a little more unpredictable than in office LANs. The number of cell towers, the quality of their interconnect network, temporary issues with weather, natural and built obstructions between the phone and the mast, network outages, and many other things affect your perception of the quality of a mobile network.

When you buy a mobile lease, that is, sign up with a mobile operator, you’re buying a snapshot of their offering: how things stand today. That is, quality and prices can change in the future. If the operator doesn’t expand capacity as fast as it gains new customers, then on average the throughput will decline.

Sometimes bottlenecks in capacity are not directly up to the operator – they might face legislative, technical or even political reasons for not being able to change the network structure.

Evolution of mobile tech

15 years back, the phone was mostly about 2 things: built-in features and fixed software. The pace of development was fast, but in a different way: every release of a new phone model was anticipated with questions about what Truly New Features were packed into the make and model.

Phones were also exploring the form factor quite radically in the 1990s: whether the phone was a single piece or had a hinge (joint) and a separate screen; how many physical buttons the phone had; the design of the buttons (ordinary, 4-way, a “navi” roller, etc.).

At some point completely new things, like the digital camera, surprised the market. The camera was an interesting thing, especially judged in hindsight. It became an essential ingredient of the smartphone.

World first in camera phones? Samsung with the model

Further down the line, innovations became more anticipated: you'd still get leaps of exciting new development, but it was incremental, not disruptive. iPhones have basically stopped evolving since the iPhone 4: even the staff at Apple Stores roll their eyes when asked “What's the next iPhone going to bring?”. Answer: “A little bit of this and that – nothing magic.”

Apps, apps, more apps!

Nowadays there are millions of apps on app stores. An average consumer keeps about 17 apps per phone.

Thus the competition for “attention” on a user's phone is fierce, even though the technical limit on holding more apps is starting to vanish. People just don't want too much clutter on that very personal real estate. Typically, starting an app requires some swiping left or right on the phone. The more apps you have, the longer it takes to get to the correct one. It's a usability issue.

Today's 1 Mbit/s, 5 Mbit/s, or even 40-50 Mbit/s burst download speeds are insanely fast compared to the almost magical 9600 bit/s that was initially allocated to a phone by its “host”, the mobile tower. Today's speeds are thus several thousand times higher.

Let's put Moore's law to the test. Moore's law can be used to predict the future of technological evolution. It's an empirical formula, based on the initial observation of how the gates (components) in integrated circuits kept getting smaller. This ‘miniaturization’ is the key phenomenon that enables high-tech products.
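As a back-of-the-envelope check, assume a Moore-style doubling every two years and apply it to the 9600 bit/s of the mid-1990s. The sketch below (with assumed years and speeds) suggests mobile data rates have grown roughly in line with, or even a bit faster than, such a doubling:

    // Moore-style projection: value after `years` if it doubles every `doublingYears`.
    const project = (start: number, years: number, doublingYears = 2): number =>
      start * 2 ** (years / doublingYears);

    // 9600 bit/s around 1996, projected 22 years forward (to 2018): about 19.7 Mbit/s.
    // Compare with the 40-50 Mbit/s bursts mentioned above.
    console.log((project(9600, 22) / 1e6).toFixed(1), 'Mbit/s');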

Evolution of the Mobile Network

How slow or fast were things with an early “smartphone”?

When Nokia's “Communicator“, a flagship product, hit the road in 1996 and soon became a legend, the mobile data network was quite different. A phone's data connection practically rode on at most 2 combined GSM data channels. Each data channel packed 9.6 kbit/s, so the Communicator could run at a then-whopping 19200 bit/s.

Take as an example one digital image of around 128 kilobytes: it would have taken about 54 seconds to download. Impractical. Nowadays, in 2018, the download time is in reality somewhere between 1 and 2 seconds. Not bad at all! Call it roughly 25 to 50 times faster.
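The arithmetic behind those numbers is simple: bytes times 8, divided by the line speed. A small sketch, using the figures from the text (a 128 kB image, 19200 bit/s then, a 50 Mbit/s burst now):

    // Pure transfer time for a payload at a given line speed (ignores latency and server load).
    const downloadSeconds = (bytes: number, bitsPerSecond: number): number => (bytes * 8) / bitsPerSecond;

    console.log(downloadSeconds(128 * 1024, 19_200).toFixed(0), 's');     // ~55 s on a 1996 Communicator link
    console.log(downloadSeconds(128 * 1024, 50_000_000).toFixed(3), 's'); // ~0.02 s of raw transfer on a 50 Mbit/s burst

The real-world 1-2 seconds of today is dominated by latency, protocol handshakes and server load rather than the raw link speed.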

By the way, sticking with that image download example: with 5G coming in 2018-2020, these kinds of small UX delays are expected to finally feel “just right” – that's my bet. There's also something the applications themselves can improve: the way those images are used affects various parts of the timing, and thus our psychological sense of how fluently a particular service works.

Let us hop back to 1996 and the Communicator:

in a practical sense, for example, most emails were pure ASCII text, around 1-2 kilobytes per mail; thus 20 new mails could be downloaded in about 18 seconds, or a single email in about 3-5 seconds, including all the protocol set-up that took place when the client-server connection was established. It wasn't exactly as fluid as you'd want, but then again, it was quite a breakthrough, a shift in the paradigm: you did not have to be sitting at your desk. You were suddenly truly “mobile”, capable of doing most of your work anywhere. Well, anywhere within reach of a mobile network.

The 19200 bps speed we mentioned is a tiny fraction of the speed of today's 4G networks: roughly 0.04 percent of a 50 Mbit/s connection, far less than 1 percent. Imagine that!

Still, the 1990s Nokia Communicator was a phenomenal success story at the time. It was iconic, and it also brought significant power to its owner.

Handover and the magic of data

A mobile phone, in order to stay useful and true to its breed, keeps a connection to its “parent”, the mobile mast. The phone basically only peeks into the world through its data connection. This is the very core of the “world is at your fingertips” illusion: yes, and no. Only the next “hop” (a computer with a network connection) is at your phone's fingertips. If that server machine, the “receiving end” of your mobile data connection, is not up, then you'll take a dive into the abyss: “no connectivity”. However, the technology has kept evolving, and almost 100% of the time you will have a data connection. Almost magic!

The scenario is not that different from your laptop and a WiFi access point (“hotspot”). With mobile phones the distances are greater, and usually there's also rather constant movement, so a “handover” is expected. In a handover the mobile phone leaves one cell and enters another. In many cases this is completely automatic and the user never notices that a handover took place.

(By the way: depending on the design, if in the “laptop scenario” mentioned above you leave the range of your hotspot, you'll lose connectivity. Any attempt to access the Internet will show an error message. It is possible to design campus networks so that there are WiFi repeaters within the area, densely enough, so you can carry your laptop around and keep working, and it jumps from one hotspot to another as needed.)

Looking at the feature list of a mobile phone sometimes makes us dizzy. The lists, as fair and accurate as they might be, don't tell the whole story. Usually just acronyms are used: WCDMA, 3G, GSM, WiFi, UMTS, EDGE, and so on. I think there's a vast population of users to whom these words mean absolutely nothing, or, sometimes even worse, they vaguely bring to mind something which in practice leads to expensive or inefficient use of the phone.

On the manufacturer side, 3G and 4G networks have become the de facto standard. Availability and quality are still an ongoing quest worldwide, and there are huge differences in average density and speeds per country.

Without proper knowledge one can use the phone in a very suboptimal way, pay too much in fees, and get frustrated by network outages. Knowledge never hurts.

GSM – the foundations of mobility

The three letters “GSM” used to be synonymous with the whole handset industry. GSM originates from a standards group (originally Groupe Spécial Mobile), a very small one, which drafted the technology that united a back-then quite heterogeneous set of mobile network technologies. One crucial example: handover.

When you use a mobile phone in a car or while walking, you cross boundaries between different base stations. A base station can be “heard” only for about a few kilometers; beyond that the signal fades out. A handover happens when two base stations agree that your mobile phone call will continue even after crossing the boundary.
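To give a feel for the decision involved, here is a toy sketch of a handover rule (not the actual GSM procedure, just an illustration): the serving cell is swapped only when a neighbouring base station is stronger by a clear margin, so the call doesn't ping-pong back and forth at the boundary.

    // Toy handover rule with hysteresis (illustrative only, not the real GSM algorithm).
    interface Cell { id: string; signalDbm: number; }

    function pickServingCell(current: Cell, neighbours: Cell[], hysteresisDb = 6): Cell {
      // Strongest candidate among the current cell and its neighbours.
      const best = neighbours.reduce((a, b) => (b.signalDbm > a.signalDbm ? b : a), current);
      // Switch only if the candidate clearly beats the current cell.
      return best.signalDbm > current.signalDbm + hysteresisDb ? best : current;
    }

    const serving: Cell = { id: 'A', signalDbm: -95 };
    console.log(pickServingCell(serving, [{ id: 'B', signalDbm: -80 }]).id); // "B": handover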

GSM is the main standard that enables worldwide mobile communications. GSM was immensely important in its time, and it still guarantees interoperability of phones at the voice and text message level.

A couple of rules of thumb:

– know your phone's settings: what your phone's hardware (features) allows, what can be modified in the settings, and where to find them in the menus

– different network technologies have varying maximum speeds, physical ranges, and security. Use the one that is suitable for your purposes.

Analog and digital networks

I'll largely leave the software side aside and concentrate on the (physical) network features. The original cell phones used a different kind of technology: analog networks.

These were very much the same as what radio amateurs had been using for decades: one's voice is shifted to a different frequency, mixed with a carrier wave, and transmitted. At the other end, the reverse process happens and the listener receives the original voice.

While they were simple, analog networks had problems with security, conflicts (overhearing other people's calls), and the inability to serve as a backbone for modern services. Thus the GSM standard was born. The word ‘GSM’ came to mean the cell phone itself, too. GSM is an interoperability standard which defines digital communications between phones, using mobile base stations (‘masts’).

The neat idea of digitization is that once you have that basic ‘pipe’, a connection between two phones, you can do a lot of things over it. Text messages, the Web, email, and a lot of other services became possible. A flagship product of this kind of “mobile office” was probably Nokia's Communicator 9000.

Antennas

But as simple as we'd like to keep the digitization, the truth is a bit more complex. The basic reason is that, no matter how digital, the communication still happens in the electromagnetic spectrum: in the air, as radio waves launched and captured by antennas. The waves fly through the air (kind of; they would propagate ideally in a vacuum, but we would have trouble without air to breathe 😉) and are picked up by another antenna, the receiver.

This propagation of electromagnetic energy obeys the laws of physics, which don't give a heck about what's going on in the “upper layers”. All the communications we use in today's world (2016) happen within some band (slice) of frequencies.

Where do WiFi, Bluetooth, 3G, 4G come in?

The thing to note first is that phone communications can be of three distinct types:

  1. autonomously between two phones
  2. phone using local wireless area network (WiFi)
  3. phone using data network (3G, 4G)
The first type of communication includes Bluetooth and infrared (IR). It's just about “swapping” data when the phones are relatively close to each other. The significance of phone-to-phone communication has declined, as it is simply easier to routinely use a “TCP/IP” communication method, i.e. the Internet. Another thing that diminishes the practicality of phone-to-phone is that large business platforms like Facebook, Google tools and a host of others all benefit most if all data goes through a central point (their servers).
However, one place where autonomous communication will remain important is between the mobile phone and local information systems, like ticket vending machines or a car's entertainment system.
It doesn't cost anything, because there is no operator in between. The phones usually negotiate a security code (PIN) by asking the users to agree on one. People use this kind of communication to swap addresses, copy files, and so on. The speeds vary, Bluetooth being faster than infrared. BT is also more secure and robust against interference. NFC (near field communication) is also a type of autonomous communication, where the receiver is often a POS device at a shop.
Where does one need WiFi?
WiFi is actually a very neat feature that can save you money and make software installations, updates, and operating system upgrades easier. WiFi means that your phone essentially looks like a computer to a WiFi hotspot device: the device lets you become part of the network, essentially the Internet.
The most obvious place to have WiFi is at your home or work. Public places like libraries, cafes, and metropolitan areas in general can also have network coverage.
WiFi itself cannot charge you (make you pay) directly, but in some places you may not get access rights to the WiFi network before you pay. For example, some hotels, airports and similar places require you to buy a coupon that gives you a username and password, or might otherwise authorize you to use the network.
The ‘native’ data communication happens over 3G or 4G networks. These are the mechanisms by which your phone can relay information to and from any other computer (server) reachable via the Internet. When you load a Web page, read your email, or do anything that comes from the Internet, your phone will most likely use these networks (though the preference, the order of networks, is configurable).
Note that 3G and 4G data has a real price. There are several billing plans (‘data plans‘): by the amount of data transferred, a fixed data plan (monthly cost), a temporary data plan (usually bought for one day) – actually too many to list exhaustively. What matters is that you know your plan type.
The most dangerous situation is when you are not sure what kind of plan you have and still keep using the data features: this puts you in jeopardy of getting a very hefty bill the next time your operator sends one. So: be sure! Use the Internet (on a PC), or call your operator to get up-to-date information on the plans.
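As a small illustration of why the plan type matters, the sketch below compares a pay-per-megabyte plan with a fixed monthly plan. The prices and usage are entirely made up for the example; check your own operator's real figures.

    // Hypothetical prices, purely for illustration; real tariffs vary by operator and country.
    const pricePerMb = 0.05;     // assumed price per MB on a pay-per-use plan
    const fixedMonthly = 20;     // assumed monthly price on a flat-rate plan
    const monthlyUsageMb = 1500; // assumed usage

    const payPerUseCost = monthlyUsageMb * pricePerMb; // 75 in this made-up case
    console.log(payPerUseCost > fixedMonthly ? 'flat rate wins' : 'pay-per-use wins');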
What's the future of mobile computing and phones?

  • “no one knows”
  • ..but still we might have good guesses

Understanding the market drivers

  • companies
  • temporary financial drivers in component and manufacturing profitability
  • large handset manufacturers are also more and more interested in network quality
  • B2B sales drive security needs: corporations are obliged by law to meet security standards, and mobile security is part of that
Many smartphones become severely “impeded” without a good-quality mobile network. The phone/network symbiosis has become evident. A way to test this is to turn off mobile data on your own phone: then the apps will mostly function only when you're within reach of free WiFi. Some apps, however, are useful even in this offline mode.
One can still home in on WiFi networks here and there, but without mobile data coverage, the phone loses much of its usefulness on the move.
Theoretically, an ordinary mobile operator's services could be hosted on a server, and the phones could roam in WiFi-like networks. This would open up the traditionally very “closed-domain” role of a mobile operator. But there are strong economic incentives that direct the path of technological advances in things like mobile networks: one of a carrier's most valuable and hardest-to-imitate assets is exactly this “core mobile network”.
There have been tests of this kind, for example in the city of Oulu, Finland. Oulu is one of the birthplaces of mobile telecommunications, and it has a pioneering attitude towards developing even radically new kinds of mobile paradigms. The City of Oulu provided a “Finland first”: a communal, free wireless network.
Why haven't the data speeds grown faster?

Reality vs. what we think

  • network coverage
  • true average attainable speeds
What's keeping us from enjoying high-quality, uninterrupted TV on a mobile phone? The answers vary, and there's a lot behind the development. Some of what we're experiencing is due to heavy standardization processes. Mobile communications is a field that needs rules for the players, in order to make it possible for a large group of users to enjoy the experience. If there were no standardization of radio frequency use, we would essentially be electronically jamming each other and no one would be able to get data through.
Networking in general had problems in the 1970s and 1980s due to inconsistent technologies, which meant that even within a single building there might have been half a dozen different network technologies. (By the way, Cisco, the network company, was built on the vision that these networks should be interconnected and working together!)
The same kind of problem troubled WiFi, too: at first, incompatibilities seemed to be a major obstacle to success. When a technology is in its infancy, consumers are often very skeptical; and when the benefits start to show up, new customers line up.

Final words

Mobile networks arose from a simple idea: carry voice and data through a backbone, and “surface” the coverage with a radio connection. Thus you create a network that allows you, the user, to roam around freely and still be connected. The first analog mobile networks were already being tried in the 1970s. The GSM standard was a major milestone in the unification and standardization of digital communication in mobile networks. Data usage overtook voice during the 2000s. The craving for ever greater speeds, coverage and flexibility drives the evolution of mobile networks, now entering the 5G era in around 2018-2020 throughout the world.

DON'T CLIMB THE MASTS…

By the way! I really don't recommend scaling mobile masts. Rather, pass this article on, and/or drop a question here, right in the comments. That way we'll get much more information about this modern-day marvel of mobile networks.