MVVM (design pattern)
This is an article about the MVVM design pattern. In previous articles I erroneously called it the MVP pattern, because I misunderstood the role of the presenter in the latter.
In the present article I decided to gather the data that I have lately scattered among several other articles; most notably, I am going to talk about the ViewModel class in Android.
The project
The example used in this article is MVVM.
The problem
- Creation of a model that responds to events on the device (activity recognition).
- Creation of a viewmodel that presents the most probable activity and its confidence.
- Creation of a mock model that generates test events and is retrieved with inversion of control for the sake of Espresso tests.
The structure of the project
I divided the relevant parts (model, view, viewmodel) into corresponding packages, respectively pl.org.seva.mvvm.model, pl.org.seva.mvvm.view and pl.org.seva.mvvm.viewmodel.
In a production project, where I would probably have plenty of different ViewModel classes, I would divide them by domain instead (one package for settings, another for entering and editing data, another for logging in, etc.). I wouldn't use package names like 'model' or 'view' at all, although I would use the suffix -ViewModel in the name of every class that extends the abstract ViewModel.
The data (the model)
The following class contains the result of the activity recognition (the description of the most probable activity and its confidence):
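The article's code is not reproduced here, but based on the description it is probably a simple data class along these lines (the names ActivityDesc, desc and conf are my assumptions):

```kotlin
// Hypothetical result holder: the description of the most probable
// activity and its confidence (0-100).
data class ActivityDesc(val desc: String, val conf: Int)
```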
Activity recognition interface (the model)
This is the interface that contains the Observable for the activity recognition:
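A hedged sketch of the interface and the global constant described below (the names ar and activity, and the Kodein lookup helper instance(), are assumptions based on the text):

```kotlin
import io.reactivex.Observable

// Global, lazily initialized reference to the implementation.
// instance() stands for the project's Kodein lookup helper - an assumption.
val ar: ActivityRecognitionObservable by lazy { instance<ActivityRecognitionObservable>() }

interface ActivityRecognitionObservable {
    val activity: Observable<ActivityDesc>
}
```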
Notice that there is no LiveData above. (It will only be created when I instantiate the ViewModel.)
The val ar before the definition of the interface is a global constant that holds the reference to its implementation. It is initialized lazily. The binding that defines how this is done is discussed in the following section.
Inversion of control
In this article an instance of the Observable is retrieved lazily with the help of Kodein. This is the binding:
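A sketch of such a binding, assuming the Kodein 5/6 DSL and a module named here mainModule (both assumptions):

```kotlin
import org.kodein.di.Kodein
import org.kodein.di.generic.bind
import org.kodein.di.generic.singleton

val mainModule = Kodein.Module("main") {
    // ctx is the application context, provided elsewhere in the project.
    bind<ActivityRecognitionObservable>() with singleton { SensorActivityRecognitionObservable(ctx) }
}
```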
In the above line of code, ctx stands for the application context.
Another binding for the ActivityRecognitionObservable is used for Espresso testing. It will be described later, in a section dedicated to tests.
The following section describes the implementation of ActivityRecognitionObservable using the device's sensors.
Sensor implementation
This is the whole code of the class that recognizes the most probable activity:
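The repository contains the full class; below is a hedged sketch of what it plausibly looks like, condensed to the parts discussed in this section. The class name SensorActivityRecognitionObservable comes from the text; the intent action, resource ids and receiver name are assumptions, and the GoogleApiClient setup is elided:

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import com.google.android.gms.location.ActivityRecognitionResult
import com.google.android.gms.location.DetectedActivity
import io.reactivex.Observable
import io.reactivex.subjects.PublishSubject

class SensorActivityRecognitionObservable(private val ctx: Context) : ActivityRecognitionObservable {

    private val subject = PublishSubject.create<ActivityDesc>()
    override val activity: Observable<ActivityDesc> = subject.hide()

    init {
        // Building the GoogleApiClient and requesting activity updates
        // is beyond the scope of the article and elided here.
        ctx.registerReceiver(Receiver(), IntentFilter(ACTIVITY_RECOGNITION_INTENT))
    }

    private inner class Receiver : BroadcastReceiver() {
        override fun onReceive(context: Context, intent: Intent) {
            val result = ActivityRecognitionResult.extractResult(intent) ?: return
            val mostProbable = result.mostProbableActivity
            // No DetectedActivity constant maps to the value 6, hence the
            // individual branches instead of an array of resource strings.
            val desc = with(context) {
                when (mostProbable.type) {
                    DetectedActivity.IN_VEHICLE -> getString(R.string.activity_in_vehicle)
                    DetectedActivity.ON_BICYCLE -> getString(R.string.activity_on_bicycle)
                    DetectedActivity.ON_FOOT -> getString(R.string.activity_on_foot)
                    DetectedActivity.STILL -> getString(R.string.activity_still)
                    DetectedActivity.TILTING -> getString(R.string.activity_tilting)
                    DetectedActivity.WALKING -> getString(R.string.activity_walking)
                    DetectedActivity.RUNNING -> getString(R.string.activity_running)
                    else -> getString(R.string.activity_unknown)
                }
            }
            subject.onNext(ActivityDesc(desc, mostProbable.confidence))
        }
    }

    companion object {
        const val ACTIVITY_RECOGNITION_INTENT = "activity_recognition_intent"
    }
}
```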
First let's focus on the actual BroadcastReceiver.
The class that extends BroadcastReceiver sets the description (the name) and the confidence of the most probable activity.
At first I was tempted to use an array of Strings retrieved from the resources instead of a when statement, but then I discovered that, according to the DetectedActivity documentation, there is no activity for the value of 6, so I had to stick to retrieving every String individually.
Setting up a GoogleApiClient is beyond the scope of this article, so just focus on these two lines at the beginning of the class implementation:
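Those two lines probably look like this (a sketch; the field names and the choice of PublishSubject are assumptions):

```kotlin
private val subject = PublishSubject.create<ActivityDesc>()
override val activity: Observable<ActivityDesc> = subject.hide()
```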
The above lines create a private Subject and share it out as a public Observable.
The ViewModel
This is the abstract ViewModel that is used in the project:
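A hedged sketch of such an abstract class. The function name disposableLiveData() and its Lazy<LiveData<T>> return type come from the article; the class name RxViewModel and everything else are assumptions:

```kotlin
import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel
import io.reactivex.Observable
import io.reactivex.android.schedulers.AndroidSchedulers
import io.reactivex.disposables.CompositeDisposable

abstract class RxViewModel : ViewModel() {

    private val cd = CompositeDisposable()

    // Returns Lazy<LiveData<T>>, hiding the MutableLiveData created inside.
    protected fun <T> disposableLiveData(observable: () -> Observable<T>): Lazy<LiveData<T>> =
        lazy {
            MutableLiveData<T>().apply {
                cd.add(observable()
                    .observeOn(AndroidSchedulers.mainThread())
                    .subscribe { value = it })  // set the value on the main thread
            }
        }

    // Dispose of all subscriptions at the end of the ViewModel's lifecycle.
    override fun onCleared() = cd.dispose()
}
```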
The above code creates a private instance of CompositeDisposable and defines how each instance of LiveData is created. The instances of LiveData are initialized lazily. Whenever one is created, a Consumer is immediately created that sets the value on the main thread, and the resulting subscription is added to the CompositeDisposable. The CompositeDisposable is disposed when the ViewModel is discarded at the end of its lifecycle.
Because both the ActivityRecognitionObservable and the LiveData that uses it are created lazily in this project, the actual registration of the BroadcastReceiver happens only when the LiveData is observed for the first time.
Please remember to define the return type of disposableLiveData() as Lazy<LiveData<T>>. By doing this you hide the real type of the MutableLiveData created there, so it will be seen as immutable by the code that uses it.
When the CompositeDisposable is disposed at the end of the ViewModel's lifecycle, it gives every Observable a chance to perform its cleanup. However, I chose not to unregister the BroadcastReceiver here, because I do not know when it is going to be used next. (For instance, the device could show another Fragment using it, which would create a need to register the BroadcastReceiver again.) I am not unregistering the BroadcastReceiver at all, although I could always add some code to the ActivityRecognitionObservable that unregisters the BroadcastReceiver either automatically, when the Observable is no longer observed, or on a manual call.
This is the actual implementation that observes the ActivityRecognitionObservable:
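It is probably as short as this (a sketch assuming an abstract base class, here called RxViewModel, that exposes the disposableLiveData() helper described above):

```kotlin
class ActivityDescViewModel : RxViewModel() {
    // The BroadcastReceiver is registered only when activityDesc
    // is observed for the first time.
    val activityDesc by disposableLiveData { ar.activity }
}
```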
The view
These are the relevant lines of the actual Fragment that uses the above ViewModel:
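A hedged sketch of those lines; the layout name, the synthetic view id activity_desc and the string resource are assumptions:

```kotlin
import android.os.Bundle
import androidx.fragment.app.Fragment
import kotlinx.android.synthetic.main.fr_activity_desc.*

class ActivityDescFragment : Fragment() {

    override fun onActivityCreated(savedInstanceState: Bundle?) {
        super.onActivityCreated(savedInstanceState)
        val vm = getViewModel<ActivityDescViewModel>()
        // Invoke the LiveData with a LifecycleOwner and an observer lambda.
        vm.activityDesc(this) {
            activity_desc.text = getString(R.string.activity_desc, it.desc, it.conf)
        }
    }
}
```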
There is no need to initialize the ViewModel lazily in this particular case. The LiveData is initialized lazily anyway, so it makes no difference whether you initialize the ViewModel lazily or eagerly here. However, in some of my projects I initialize the ViewModel lazily, because I want to use it outside of the scope of onActivityCreated(). If you want to learn how to do it lazily, you can read a separate article on this blog.
The call getViewModel<ActivityDescViewModel>() eagerly creates the instance of the ViewModel, but doesn't initialize the LiveData yet.
This is how the initialization of the ViewModel works under the hood:
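The helper presumably delegates to the framework; this sketch assumes the ViewModelProviders API that was current at the time of writing (the real helper may differ):

```kotlin
import androidx.fragment.app.Fragment
import androidx.lifecycle.ViewModel
import androidx.lifecycle.ViewModelProviders

// Obtain (or create) a ViewModel scoped to this Fragment.
inline fun <reified VM : ViewModel> Fragment.getViewModel(): VM =
        ViewModelProviders.of(this).get(VM::class.java)
```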
A subsequent invocation of vm.activityDesc initializes the SensorActivityRecognitionObservable, registers the BroadcastReceiver, subscribes to the Observable and starts posting values to the LiveData being referenced here.
Because you will probably not want to refer to an instance of the LiveData created this way inside the Fragment in any way other than actually observing it, I shortened the syntax by creating an extension operator:
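Such an operator might look like this (a sketch; it simply wraps the lambda in an Observer and forwards to observe()):

```kotlin
import androidx.lifecycle.LifecycleOwner
import androidx.lifecycle.LiveData
import androidx.lifecycle.Observer

// Lets a LiveData be invoked like a function: liveData(owner) { ... }
operator fun <T> LiveData<T>.invoke(owner: LifecycleOwner, observer: (T) -> Unit) =
        observe(owner, Observer { observer(it) })
```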
You can just invoke the LiveData with a LifecycleOwner and an Observer, and it will observe the LiveData for you. To remind you, this is the syntax that calls the above operator function:
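Roughly (the view id is an assumption):

```kotlin
vm.activityDesc(this) { activity_desc.text = it.desc }
```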
Alternatively, you may write:
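That is, the classic form without the operator (again with an assumed view id):

```kotlin
vm.activityDesc.observe(this, Observer { activity_desc.text = it.desc })
```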
Running
When you run the application, after about one second you should start seeing the description of the activity recognized by your device’s sensors. You can test it manually by shaking the device for a couple of seconds, or you can test it automatically using the way described in the following section.
Testing
To mock the ActivityRecognitionObservable you have to create another binding. This is the module that creates it:
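A sketch of the test module; the module and class names are assumptions, and overrides = true lets the mock binding replace the production one:

```kotlin
import org.kodein.di.Kodein
import org.kodein.di.generic.bind
import org.kodein.di.generic.singleton

val mockModule = Kodein.Module("mock") {
    bind<ActivityRecognitionObservable>(overrides = true) with
            singleton { MockActivityRecognitionObservable() }
}
```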
This is the implementation:
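A plausible mock implementation, generating a simulated event on a timer (the interval and the activity values are assumptions):

```kotlin
import io.reactivex.Observable
import java.util.concurrent.TimeUnit

class MockActivityRecognitionObservable : ActivityRecognitionObservable {
    // Emit a fake activity once per second, alternating between two values.
    override val activity: Observable<ActivityDesc> =
        Observable.interval(1, TimeUnit.SECONDS)
            .map { ActivityDesc(if (it % 2 == 0L) "STILL" else "WALKING", 100) }
}
```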
This is the code that imports the Kodein module:
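A hedged sketch of such an Application; productionKodein stands for the production container and is a hypothetical name:

```kotlin
import android.app.Application
import org.kodein.di.Kodein
import org.kodein.di.KodeinAware

class MockApplication : Application(), KodeinAware {
    // allowOverride lets the mock module's bindings shadow the production ones.
    override val kodein = Kodein {
        extend(productionKodein, allowOverride = true)
        import(mockModule, allowOverride = true)
    }
}
```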
And the AndroidJUnitRunner that starts the above MockApplication:
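This is a standard pattern for swapping the Application class under test; the runner's name is an assumption:

```kotlin
import android.app.Application
import android.content.Context
import androidx.test.runner.AndroidJUnitRunner

class MockTestRunner : AndroidJUnitRunner() {
    // Instantiate MockApplication instead of the production Application.
    override fun newApplication(cl: ClassLoader, className: String, context: Context): Application =
            super.newApplication(cl, MockApplication::class.java.name, context)
}
```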
To use this particular AndroidJUnitRunner you have to register it in your module-level build.gradle:
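Along these lines (the runner's fully qualified name is an assumption):

```groovy
android {
    defaultConfig {
        testInstrumentationRunner "pl.org.seva.mvvm.mock.MockTestRunner"
    }
}
```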
This is the actual test:
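A hedged sketch of the Espresso test; the activity, view id, expected text and the delay() helper described in the next section are assumptions:

```kotlin
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.matcher.ViewMatchers.isRoot
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.espresso.matcher.ViewMatchers.withText
import androidx.test.ext.junit.runners.AndroidJUnit4
import androidx.test.rule.ActivityTestRule
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class ActivityDescTest {

    @get:Rule
    val activityRule = ActivityTestRule(MainActivity::class.java)

    @Test
    fun activityDescIsShown() {
        // Wait for the mock to generate an event, then check the displayed text.
        onView(isRoot()).perform(delay(1500))
        onView(withId(R.id.activity_desc)).check(matches(withText("STILL")))
    }
}
```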
Delaying an Espresso test
The test described in the above section uses a couple of delays to wait for the simulation to generate the next event. This is the ViewAction that performs the delay:
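A sketch of the delaying ViewAction, a well-known Espresso idiom built on UiController.loopMainThreadForAtLeast():

```kotlin
import android.view.View
import androidx.test.espresso.UiController
import androidx.test.espresso.ViewAction
import androidx.test.espresso.matcher.ViewMatchers.isRoot
import org.hamcrest.Matcher

fun delay(millis: Long): ViewAction = object : ViewAction {
    override fun getConstraints(): Matcher<View> = isRoot()
    override fun getDescription() = "Wait for $millis milliseconds."
    // Loop the main thread instead of sleeping, so Espresso stays in sync.
    override fun perform(uiController: UiController, view: View) =
            uiController.loopMainThreadForAtLeast(millis)
}
```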
Conclusion
In the above example you had a chance to learn how to use Kodein to create an Observable that reacts to events coming from your device's sensors. You could also learn how to create a mock implementation.
I used the singleton pattern in conjunction with Kodein to create an Observable that is triggered by an unlimited sequence of events. Although I did not implement it above, I could create a function that stops the Observable, invoked either manually or automatically when the last observer unregisters.
I explained how to create a Kodein module, visible only during tests, that creates bindings for the mock instances.
I decided to use RxJava in the project. I didn’t want to create coroutine channels before their implementation becomes stable in Kotlin 1.4.
I used lazy initialization wherever appropriate, in order to register the BroadcastReceiver only when it is required by an observed instance of LiveData.
I hope that by writing the present article I've exhausted the topic of MVVM, which actually took me a couple of months to understand and integrate with my present knowledge of Android Jetpack. By writing this and a couple of previous articles I hope to have demonstrated one man's journey to refining a sensible architecture.
Donations
If you’ve enjoyed this article, consider donating some bitcoin at the address below. You may also look at my donations page.
BTC: bc1qncxh5xs6erq6w4qz3a7xl7f50agrgn3w58dsfp