Architecture
Overview
Sailfish OS is a mobile operating system based on GNU/Linux.
The Sailfish OS architecture is primarily made up of three areas: the hardware adaptation layer, the middleware layer and the app/UI layer.
Hardware Adaptation layer
In the hardware adaptation layer, Sailfish OS uses a Linux kernel with hardware-specific additions.
Sailfish OS can run on top of standard Linux hardware with native drivers, or it can use the drivers of Android-compatible hardware via libhybris, which bridges glibc-based Linux libraries with Bionic-based ones such as Android's. Building an adaptation for Android-compatible hardware is described in the HADK documentation.
Middleware layer
The middleware layer contains the core system components used to build services on top of the hardware adaptation layer.
The Qt C++ application development framework provides the primary development libraries. Aside from the main Qt modules, Sailfish OS uses add-on modules such as Qt Maps, Qt Sensors and Qt Contacts. Also, all Sailfish applications are written with QML, a Qt technology for easily building user interfaces into C++ applications.
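For illustration, here is a minimal sketch of using one such add-on module, Qt Sensors, from C++. It assumes only the standard Qt Sensors C++ API; on a Sailfish OS device the readings are ultimately delivered through the sensor middleware and hardware adaptation described in the call chains below.

```cpp
#include <QCoreApplication>
#include <QAccelerometer>
#include <QAccelerometerReading>
#include <QTimer>
#include <QDebug>

// Minimal sketch: read accelerometer values through the Qt Sensors
// add-on module for a few seconds and print them.
int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QAccelerometer sensor;
    QObject::connect(&sensor, &QAccelerometer::readingChanged, [&sensor]() {
        QAccelerometerReading *reading = sensor.reading();
        qDebug() << "accelerometer:" << reading->x() << reading->y() << reading->z();
    });
    sensor.start();

    // Stop after five seconds; a real application would keep the sensor
    // active only while it actually needs the data.
    QTimer::singleShot(5000, &app, &QCoreApplication::quit);
    return app.exec();
}
```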
Sailfish OS also includes a large range of middleware libraries and frameworks that serve the application layer; for details on where to find their sources, refer to Sailfish OS Source. They are written in C/C++, and libraries that are accessed directly by the UI layer include QML modules so that QML-based applications can use them without additional QML/C++ bindings.
Application and UI layer
Sailfish OS applications are written in a combination of C++ and QML/Qt Quick. QML is a Qt technology primarily used to declaratively assemble application user interfaces and connect them to C++ backend code, and Qt Quick is a core part of the QML framework for UI creation. A Sailfish OS app typically defines its UI in QML and, if necessary, includes C++ utility code to implement functionality that is otherwise unavailable from the QML layer.
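For example, a typical application entry point is only a few lines of C++. The sketch below assumes the libsailfishapp helper library used by the Sailfish application templates; the application name and QML path are placeholders.

```cpp
#include <sailfishapp.h>

// SailfishApp::main() creates the QGuiApplication and QQuickView, loads the
// application's main QML file (by convention something like
// /usr/share/harbour-example/qml/harbour-example.qml) and runs the event loop.
// Any C++ types needed by the QML side would be registered before this call.
int main(int argc, char *argv[])
{
    return SailfishApp::main(argc, argv);
}
```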
Application launching and lifetime are controlled by Lipstick, which provides the essential user-session UI with an application launcher and other main screens, and also acts as the window manager.
Call chains
Here are a few call/usage chains of components in Sailfish OS, from the UX/middleware down to the hardware adaptation driver. The parts that mention droid/libhybris/binder/Android HAL depend somewhat on the Android BSP driver version of the device, and for native adaptations the chain looks different.
Note that Sailfish OS does not ship with a kernel; the kernel is provided by the hardware adaptation layer. Currently the lowest supported kernel is 3.4 (which needs some patches), and kernel 4.4 or newer is recommended. A configuration check script is used to verify that the kernel provides all required functionality.
| Area | Call chain | Notes |
|---|---|---|
| Audio | pulseaudio <> pulseaudio-modules-droid <> libhybris <> audioflingerglue <> Android BSP libbinder <> miniaf <> Android BSP HAL: audio | |
| Bluetooth | Bluez5 <> kernel VHCI <> bluebinder <> libgbinder <> Android BSP HAL: android.hardware.bluetooth | Android BSP >= 8 |
| Camera/Multimedia | gst-droid <> libhybris <> droidmedia <> Android BSP libbinder <> minimedia/minisf <> Android BSP HAL | |
| Display | mce <> mce-plugins-libhybris <> libhybris <> Android BSP HAL: gralloc or hwcomposer | |
| Fingerprint | sailfish-fpd <> sailfish-fpd-slave <> libgbinder <> Android BSP HAL: android.hardware.fingerprint | Android BSP >= 8 |
| Graphics | qtbase <> qt5-qpa-hwcomposer-plugin <> libhybris-compat-library: libhwc2_compat_layer <> libgbinder <> Android BSP HAL: android.hardware.graphics.composer | Android BSP >= 8 |
| Graphics | qtbase <> qt5-qpa-hwcomposer-plugin <> libhybris <> Android BSP HAL: hwcomposer | Android BSP <= 7 |
| LED | mce <> kernel | |
| LED | mce <> mce-plugins-libhybris <> libhybris <> Android BSP HAL: lights | |
| Location (GPS) | geoclue <> geoclue-providers-hybris <> libgbinder <> Android BSP HAL: android.hardware.gnss | Android BSP >= 8 |
| Location (GPS) | geoclue <> geoclue-providers-hybris <> libhybris <> Android BSP HAL: gps | Android BSP <= 7 |
| Modem | oFono (ril driver) <> libgrilio <> ofono-ril-binder-plugin <> libgbinder-radio <> libgbinder <> Android BSP HAL: android.hardware.radio | Android BSP >= 8 |
| Modem | oFono (ril driver) <> libgrilio <> socket <> Android BSP: rild | Android BSP <= 7 |
| NFC | nfcd <> nfcd-binder-plugin <> libgbinder <> Android BSP HAL: android.hardware.nfc | Android BSP >= 8 |
| Sensors | sensorfw-qt5 <> sensorfw-qt5-hybris <> libgbinder <> Android BSP HAL: android.hardware.sensors | Android BSP >= 8 |
| Sensors | sensorfw-qt5 <> sensorfw-qt5-hybris <> libhybris <> Android BSP HAL: sensors | Android BSP <= 7 |
| Storage (eMMC & sdcard) | udisks2 <> kernel | |
| Touch | app <> lipstick <> qtbase <> evdev <> kernel | See below |
| USB | usb_moded <> kernel | |
| MTP | buteo-mtp <> usb_moded <> kernel | See Architecture#MTP |
| WiFi | connman (sailfish_wifi plugin) <> libgsupplicant <> wpa_supplicant <> kernel | |
| Mobile data | connman (sailfish_ofono plugin) <> libgofono <> oFono | |
| Volume keys | pulseaudio <> lipstick <> mce <> kernel | |
| Power key | call-ui or alarm-ui <> mce <> kernel | Short keypress |
| Power key | systemd <> dsme <> kernel | 5s+ keypress |
For more information on the areas covered by middleware libraries and services, see Core Areas and APIs.
MTP
usb_moded detects a USB connection based on a udev notification from the kernel, initiates the gadget configuration and starts buteo-mtp. buteo-mtp then finalizes the gadget configuration and handles the data transmission.
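The following is only an illustrative sketch of the detection mechanism, not usb_moded's actual implementation: a minimal libudev monitor that reacts to the same kind of USB add/remove notifications from the kernel.

```cpp
#include <libudev.h>
#include <poll.h>
#include <cstdio>

// Illustrative sketch: watch kernel udev notifications for the "usb"
// subsystem and print the action, roughly the kind of event usb_moded
// reacts to before configuring the USB gadget.
int main()
{
    struct udev *udev = udev_new();
    if (!udev)
        return 1;

    struct udev_monitor *mon = udev_monitor_new_from_netlink(udev, "udev");
    udev_monitor_filter_add_match_subsystem_devtype(mon, "usb", nullptr);
    udev_monitor_enable_receiving(mon);

    struct pollfd pfd = { udev_monitor_get_fd(mon), POLLIN, 0 };
    while (poll(&pfd, 1, -1) > 0) {
        struct udev_device *dev = udev_monitor_receive_device(mon);
        if (!dev)
            continue;
        std::printf("usb %s: %s\n",
                    udev_device_get_action(dev),
                    udev_device_get_sysname(dev));
        udev_device_unref(dev);
    }

    udev_monitor_unref(mon);
    udev_unref(udev);
    return 0;
}
```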
Touch
The mce-tools package provides the evdev_trace command. Use the --show-readers option to figure out which device handles touch input (e.g. ABS_MT_TRACKING_ID). See the kernel documentation for more information about the multi-touch protocol.
$ /usr/sbin/evdev_trace --show-readers
Start tracing your touch screen (event2 is an example).
$ /usr/sbin/evdev_trace -t /dev/input/event2
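The same events can also be read programmatically. Below is a minimal sketch using the standard Linux evdev interface; the device path is an example, as above, so use the node reported by --show-readers.

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>
#include <cstdio>

// Minimal sketch: print multi-touch tracking IDs and positions from an
// evdev node. The path is an example; use the node that handles touch input.
int main()
{
    int fd = open("/dev/input/event2", O_RDONLY);
    if (fd < 0) {
        std::perror("open");
        return 1;
    }

    struct input_event ev;
    while (read(fd, &ev, sizeof ev) == sizeof ev) {
        if (ev.type != EV_ABS)
            continue;
        if (ev.code == ABS_MT_TRACKING_ID)
            std::printf("contact %s (id %d)\n",
                        ev.value < 0 ? "lifted" : "tracked", ev.value);
        else if (ev.code == ABS_MT_POSITION_X || ev.code == ABS_MT_POSITION_Y)
            std::printf("%c = %d\n",
                        ev.code == ABS_MT_POSITION_X ? 'x' : 'y', ev.value);
    }

    close(fd);
    return 0;
}
```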
Qt handles evdev touch events via the evdev plugin (/usr/lib64/qt5/plugins/generic/libqevdevtouchplugin.so on a 64-bit device). The relevant Qt logging category is "qt.qpa.input". QEvdevTouchScreenHandler auto-detects the touchscreen.
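To see that output from a specific application, the category can be enabled either through the QT_LOGGING_RULES environment variable or programmatically; a minimal sketch of the latter:

```cpp
#include <QGuiApplication>
#include <QLoggingCategory>

int main(int argc, char *argv[])
{
    // Enable debug output from Qt's input handling, including the evdev
    // touch plugin (equivalent to QT_LOGGING_RULES="qt.qpa.input*=true").
    QLoggingCategory::setFilterRules(QStringLiteral("qt.qpa.input*=true"));

    QGuiApplication app(argc, argv);
    return app.exec();
}
```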
Key Architectural Areas
An overview of some architectural areas and the APIs which expose the related functionality can be found in the page describing Core Areas and APIs. Some more in-depth documentation about key architectural areas follows:
- Cellular Telephony Architecture describes the mobile phone functionality in detail
- Audio Architecture describes audio routing and sharing
- Screen display and application compositing is described in the Graphical Architecture
- Multimedia Architecture covers the camera and video subsystems
- The Qt Framework explains which Qt components applications should use to access features
- Android compatibility is enabled by the Android Emulation Framework