Earlier this year, Datarella and Wirecard AG started a collaboration on a couple of blockchain projects. One of them was RAW.coin, which aims to make commodity trading more efficient through the use of blockchain technology.
We started with a thorough analysis of various commodity supply and trade chains, which led us to identify the following supply chain challenges:
rising pressure from global competition
many intermediaries and complex governance structures
end-consumers demanding ever-higher levels of transparency
the struggle of supply chain stakeholders to maintain an adequate overview of their networks and the associated supply costs
difficulty in ensuring the quality and integrity of raw materials
We decided to narrow the PoC down to one specific use case: coffee beans. The reason is not simply that we’re huge fans of (good) coffee: globally, coffee is the second most sought-after commodity after crude oil. It has a trading volume of $100B per year and is grown in 50 countries (in some of which Wirecard offers financial services). Finally, the coffee supply chain has a large number of middlemen and intermediaries who add marginal value but capture a large share of the end price paid by consumers.
The basic idea of RAW.coin is to digitise trading mechanisms and replace middlemen. As a larger vision, we aim to establish the RAW.coin network to become the ecosystem for supply chains, ensuring the origin, quality, compliance and proper handling of items tracked by the network.
The solution connects the Producer and the Importer while providing a marketplace where any commodity can be traded. It’s based on Ethereum, using smart contracts and a modern decentralised architecture. We implemented smart contracts to represent the terms and conditions of the network and, through automation, to enforce them at the same time. Additionally, we integrated with Wirecard to provide truly seamless B2B payments and fiat interactions. All transactions are done using the cryptocurrency RAW, which can be immediately exchanged into fiat using the Wirecard Gateway API.
This is an actual screenshot of the current product:
We are currently investigating the best way to scale the solution up, which other commodities to apply it to, and in which markets. If you wish to learn more, tweet us at @datarella or contact us!
Pac-Man on Wheels, the third-place winners (USD 1,000 + 15,000 XSC tokens)
Democratize autonomous driving by crowdsourcing and gamifying the data collection via IOTA Tokens
Q: What problem does your project solve?
A: To enable autonomous driving, hundreds of thousands of hours of sensor data are required. Individual data acquisition, without the possibility of merging data sets, can never yield the amount of data needed to cover all driving scenarios. This can only be achieved by crowdsourcing the data and incentivizing road users with IOTA tokens. A decentralized marketplace ensures complete data availability.
Q: What expertise and roles did you have on your team?
Dr. Simon Hassannia: Head of Business Innovation and Automotive Industry Expert
Martin Mihaylov: Computer Vision & AI Engineer, Entrepreneur, Designer, All-things-tech enthusiast
David Hawig: IOTA, IPFS and NTRU C# Developer
Q: Which technologies and tools did you use in your project?
A: NTRU for encryption, IPFS for decentralized file storage, IOTA for immutable decentralized data exchange and monetization, Xamarin for multi-platform application development
After finishing third at the Blockchained Mobility Hackathon, our team quickly set up communication channels to stay connected. Although the team only formed at the hackathon itself, we are determined to enhance and implement the project. Thanks to a good mix of software and business developers and seamless collaboration within the team, we have a strong starting point. At the DAHO.AM conference right after the hackathon, we received positive feedback from several attendees and company representatives confirming that we have found a well-fitting business case. The strong interest in “Pac-Man on Wheels” encouraged us once more to proceed with the project. We already spoke to interested companies and investors at DAHO.AM and identified several ways forward. We are still in negotiations over whether the project will be kicked off as a start-up or, which is actually more likely, integrated into a project of an existing company. Since nothing is settled yet, we appreciate and want to encourage any interested party to get in touch with us to discuss possibilities for joint realization.
Our XSC token issuance stirred much interest at the IOTA Hackathon in Gdansk. We therefore decided to extend our rewards campaign for IOTA developers by one week:
If you’re a developer who committed code to advance the IOTA network during the month of November, you’re eligible and can request up to 250 XSC until Friday, 1 December 2017.
For more information on the CSC Blockchain Evolution Incentive Scheme, click here and here.
The IOTA Hackathon took place from Nov 17 to Nov 19 in Gdansk, Poland. Software developers from all over Europe came together to put the IOTA platform to the test with various use cases. The event was sponsored by IOTA, Baltic Data Science (blockchain and big data services), Datarella (blockchain and big data consultancy) and Bright Inventions (mobile app development). Four teams of developers and software experts formed around various use cases and competed for the prize money of 4,200 IOTA. Here in Part 1 we summarize the idea iteration process of the contest’s winning team, “PlugInBaby”, and the pivot that took place while defining the project topic. Part 2 describes the development and design of the project in more detail.
Defining the Need
We started the hackathon with a group brainstorming session followed by some informal voting and group building around the topics generated.
After narrowing the focus down to the topics “Autonomous Agents” and “Decentralized Stack”, the group moved on to idea generation. Any potential topic needed to utilize the special characteristics of IOTA (scalability, speed, zero transaction costs) while avoiding limitations such as the lack of a Turing-complete language and smart contract capabilities.
Initial brainstorming considered applications in manufacturing, autonomous transportation, supply chain management and distributed sensor technology. Eventually, the idea of using IOTA as a distributed database that lets individuals or autonomous agents register free parking spaces in cities, and search for those spaces, crystallized out of the brainstorming process.
After several hours of work on the concept and potential implementations, we found structural problems with the plan. In our initial approach, the team imagined that individuals or autonomous agents/smart cars would identify free parking spaces, notify others of their availability by writing to the tangle, and potentially be compensated for the service. However, this approach left a number of important questions open.
Critical Questions that Led to the Pivot:
Why should a system for finding free parking spaces be built using IOTA?
Wouldn’t another technology be more appropriate?
Why not use a blockchain which allows for smart contracts?
Would people really use such an app?
Pivot to an open car charging network
After several hours of discussion, the team still couldn’t adequately answer the above questions so we turned to another idea. Instead of logging free parking spaces, we would provide a link between an IoT network of decentralized charging stations and traditional or autonomous cars needing charging services.
Currently, electric charging infrastructure is almost always mediated by large corporations and organizations. This project seeks to change that. The team drew inspiration from ElaadNL, which built a Proof of Concept (PoC) charging station for electric cars running fully on IOTA. Their charger is built from off-the-shelf tech and could be adopted by individuals who wish to offer electricity from their private microgrid or solar installation. What’s missing in the ElaadNL implementation is a user-friendly way to select and navigate to a charging station.
The ElaadNL PoC app works on a “pull” basis: the user has to enter a charger code to query the status of a particular charger. The team wanted to design something that works on a “push” basis, pushing the locations of open chargers to users within the familiar confines of a Google Maps interface.
The team envisioned a world in which individuals could take an open-source IoT charger kit and set up an IOTA-based charging station wherever they have access to power and a parking space. This could open up a whole new layer of community-based, decentralized charging.
The project, so conceived, was well matched to the strengths of IOTA. Scalability and transaction speed would be needed because charging speeds are continually improving and the system’s search mechanism would have to operate very quickly to guarantee a good user experience. A system with zero transaction costs was also judged appropriate for the type of microtransactions that need to occur between a car and a smart charger to enable real-time pricing for electricity.
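To make the microtransaction idea concrete, here is a minimal sketch, not part of any actual implementation, of how a charging session could be split into per-interval micropayments under real-time pricing. All names, units and rates are illustrative assumptions; amounts are kept as integer micro-units so that many tiny payments don’t accumulate floating-point drift.

```javascript
// Split a charging session into per-interval micropayments so a car can
// pay a smart charger in near real time. Amounts are integer micro-units
// (1e-6 tokens) to avoid floating-point drift across many tiny payments.
function meterSession(intervals, pricePerKwhAt) {
  // intervals: [{ timestamp, kwh }] = energy delivered in each interval
  return intervals.map((iv) => ({
    timestamp: iv.timestamp,
    kwh: iv.kwh,
    // the price function may return a different price per interval:
    // this is where real-time pricing enters
    amountMicro: Math.round(iv.kwh * pricePerKwhAt(iv.timestamp) * 1e6),
  }));
}

function sessionTotalMicro(payments) {
  return payments.reduce((sum, p) => sum + p.amountMicro, 0);
}
```

Each `amountMicro` entry would correspond to one zero-fee transfer from the car to the charger, which is exactly the pattern that fee-based blockchains make uneconomical.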
We owe a shout-out to ElaadNL for their PoC. The existence of such a charger allowed us to think in a modular fashion, abstract away the charger component and focus exclusively on building a system to find the chargers and transact with them.
Here is an overview of all reports on the IOTA Hackathon’s projects:
The team was joined by Bogdan Vacusta, who brought a real-life challenge from London Councils to the IOTA Hackathon. London Councils issue “Freedom Passes” to disabled residents, which allow them to use public transport within the city free of charge. Prior to the issuance of a Freedom Pass, one must obtain a doctor’s certificate proving disability.
Unfortunately, scammers have successfully been photoshopping doctors’ certificates for perfectly healthy London residents. No one knows for sure how many Freedom Passes have been issued under false pretenses, but the number is likely in the thousands. London Councils have no procedure in place to verify whether a certificate has been faked. A simple alert when one doctor has issued an unusually high number of certificates would be a huge step towards detecting fraud, and with the help of IOTA this step alone could save the public tens of millions of pounds per year.
The team’s task was to build a Proof of Concept (PoC) to prevent fraud. So, how can IOTA be used for fraud detection?
2. The Solution
The team decided to create a transaction from the doctor to the applicant, thus certifying the disability of the applicant on the IOTA Tangle. If an anomaly in the number of issued certificates of a doctor occurs, the system alerts the London Councils.
In an ideal scenario, the doctor would issue this digital certificate from an app (mobile or web based), signing the transaction with her private key (a measure that would itself help prevent fraud). Given the short timeframe at the IOTA Hackathon (less than 24h), the team chose to create sample data and to carry out the transaction on the doctor’s behalf for the PoC. A local database would be fed the details of the doctor and the applicant so as to identify them. The system for the PoC was thus to include the following components:
An input form for doctor and applicant data
An interface to the IOTA Tangle
A database with doctor and applicant data
A backend which analyses the data
A frontend for the London Councils with a list of alerts
3. The Process
Here’s how it works:
1. Entering the data
The data is written to the local database. Simultaneously a transaction – symbolizing the disability certificate – from the doctor to the applicant is immutably written to the IOTA Tangle. The transaction ID is, in turn, written to the local database adding the ability to prove that the certification has taken place.
3. Analysis & Reporting
The backend analyses the data and alerts the officials in case of any anomalies, e.g. if one doctor has issued an unusually high number of certificates within a certain time frame.
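The alerting rule can be sketched in a few lines of Node.js. This is a simplified illustration, not the hackathon code; the data shape and all names are assumptions.

```javascript
// Flag doctors whose number of issued certificates within the last
// `windowDays` days exceeds `threshold`. Certificate records are assumed
// to carry a doctor ID and an issue timestamp (milliseconds).
function flagAnomalousDoctors(certificates, threshold, windowDays, now) {
  const windowMs = windowDays * 24 * 60 * 60 * 1000;
  const counts = {};
  for (const cert of certificates) {
    // only count certificates issued inside the time window
    if (now - cert.issuedAt <= windowMs) {
      counts[cert.doctorId] = (counts[cert.doctorId] || 0) + 1;
    }
  }
  // doctors above the threshold are candidates for investigation
  return Object.keys(counts).filter((id) => counts[id] > threshold);
}
```

In production, the threshold would have to be derived from historical issuance data rather than fixed by hand.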
What we learned
We achieved our goal within the timeframe despite running into issues caused by working with an immature system along the way. In the end, we managed to create a Proof of Concept perfectly suitable for the setting of the IOTA Hackathon.
We did run into a few issues along the way which must be addressed by the IOTA team in order to improve the system and make it fit for future use cases:
Speed of transactions: on the IOTA testnet we experienced long wait times when confirming transactions. Submitted transactions confirmed in ~1 minute; reading transactions took ~3-5 minutes or more depending on the amount of data. This may be a testnet issue independent of the mainnet.
The documentation was not up to date; information was missing, and what documentation existed was sometimes misleading (e.g. properties marked as optional are actually required; it is not obvious that the replayTransaction function creates a completely new transaction; when sending a message instead of a transaction, the sender is not documented on the tangle, …)
Releases are not scheduled in advance; if an update lands during development, developers must adapt quickly to accommodate the changes. A release roadmap from IOTA would be very helpful.
The Node.js SDK is based on callbacks rather than Promises, the current standard for asynchronous JavaScript.
The API can easily be misused. Values and properties that shouldn’t be passed can go through without any error message. The API is missing descriptive error messages, leaving developers in the dark when it comes to hunting down bugs.
So, why IOTA?
First off, one might argue that this task could have been handled entirely with a regular database. While this is true, a database is a lot easier for hackers to attack than a blockchain or tangle. This kind of system could also have been set up on a blockchain such as Ethereum, so why use IOTA? The challenges blockchain systems are struggling to overcome are performance and scalability: due to limited block sizes, transaction times keep increasing, making those systems less usable for scenarios in which transactions must happen near-instantly.
IOTA helps to solve the problems of performance and transaction speed. The team agrees that the IOTA Tangle and similar “non-block” chain approaches are likely the most feasible way to enable scalability in the future. Also, an application using IOTA can quite easily be transferred to related use cases.
Would the team recommend using IOTA for fraud prevention?
The answer is Yes if the long-term goal is to further develop IOTA in general. The answer is No if the system should be used in a production environment at this point, since IOTA is still immature. Alternative systems that are currently more mature and could be used for the task include Hyperledger Fabric, Sovrin and Ethereum, although these blockchain systems may face scalability issues of their own, and development there is ongoing as well.
The IOTA application “Freedom Pass” scales well and is transferable to related use cases. However, IOTA must make massive improvements in performance (i.e. transaction speed), in documentation, and in the API and Node.js SDK. If the issues above are continuously addressed, the team recommends IOTA for further developing this kind of system for the public. For the public, IOTA promises future potential in data reconciliation, reduction of duplication, auditability and authentication.
This is the second installment in our posts about the experiences of the “Freedom Pass” team during the IOTA Hackathon. In the first post (found here), Kira set the stage and explained the current issues of the London Freedom Pass. In this post, we’ll get a bit more detailed with regards to how we built the project.
DISCLAIMER: Even though the project is called “Fraud Detection”, the technological focus is very much on IOTA and not at all on the machine-learning methodologies or data science one would commonly associate with fraud detection and prevention.
After we’d narrowed the scope down sufficiently to what we thought would be achievable during a hackathon, we started getting familiar with the IOTA Tangle. We followed this tutorial for making a simple transaction, written only a few weeks earlier but already requiring some modifications. Once we were familiar with the general concepts of the Tangle (much accelerated by a presentation and Q&A by ChrisDukakis of IOTA), we connected to a testnet node and started issuing transactions.
Before we get into the details of the project, a short comment on the decision whether to run a full node, i.e. the IOTA Reference Implementation (IRI), or to connect to pre-existing nodes. In short, running the IRI requires a complete Java Runtime Environment, which is one of the reasons why IOTA can’t yet run on an IoT device. Each node connected to the tangle exposes an HTTP API through which transactions can be issued. To set up an instance of the IRI, one has to acquire the addresses of nodes already connected to the tangle; the recommended way to do this is to ask people in the #nodesharing Slack channel. Given these hurdles and our time constraints, we decided it wasn’t necessary to run our own node. The core functionality we needed from the tangle was the ability to:
Register Doctor as a seed on the tangle
Register Applicant as a seed on the tangle
Perform a transaction for each certificate from the issuing Doctor to the Applicant.
Verify that a certificate was registered on the tangle given a Doctor and an Applicant
Read information off of the tangle about outgoing transactions from all Doctors
Given the above functionality, how could we leverage the existing IOTA library in the best way possible? Well, since smart contracts or most types of advanced transactions aren’t really possible on IOTA (yet), we will need some off-tangle processing, storage and UI.
For this, we implemented a backend and some wrapping to process the information from the applications. The server side was written using Node.js and the Express framework. To model the logic and structure of the database, we used MongoDB and Mongoose. The MongoDB collection contained a simple key-value store holding relevant applicant information. One could imagine upgrading it to a graph model to better mirror the tangle structure and analyse connections between Doctors and Applicants more efficiently; however, that was out of scope during the ~24h of coding we had.
In order for the user to interact with the tangle in an easy way, we built a small web-frontend. It allows the user to enter information about an application such as the national insurance number of an Applicant, postal code of the Doctor and Applicant, phone numbers, etc. At this stage, four things need to happen:
The information is saved in the MongoDB-collection,
seeds for the Applicant and Doctor are created based on an aggregate of identifying information,
new test tokens are generated and sent to the Doctor’s account and
an IOTA transaction is issued from the Doctor to the Applicant.
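The last step above can be sketched roughly as follows. The transfer shape (`{address, value, message, tag}`) follows iota.lib.js as we used it in late 2017, but treat the exact names and the send call as assumptions rather than a verbatim excerpt of our code.

```javascript
// Build the "certificate" transfer from the Doctor to the Applicant.
const TRYTE_ALPHABET = '9ABCDEFGHIJKLMNOPQRSTUVWXYZ';

// ASCII-to-trytes conversion (same scheme iota.utils.toTrytes used):
// each character maps to two trytes.
function toTrytes(input) {
  let trytes = '';
  for (const ch of input) {
    const code = ch.charCodeAt(0);
    const first = code % 27;
    const second = (code - first) / 27;
    trytes += TRYTE_ALPHABET[first] + TRYTE_ALPHABET[second];
  }
  return trytes;
}

function buildCertificateTransfer(applicantAddress, certificateJson, value) {
  return {
    address: applicantAddress,
    value, // e.g. 1 testnet token; the Doctor's seed was funded beforehand
    message: toTrytes(certificateJson), // the certificate details, as trytes
    tag: (toTrytes('FREEDOMPASS') + '9'.repeat(27)).slice(0, 27), // 27 trytes
  };
}

// With a node attached, the bundle would then be sent roughly like:
// iota.api.sendTransfer(doctorSeed, depth, minWeightMagnitude,
//   [buildCertificateTransfer(applicantAddress, data, 1)],
//   (err, bundle) => { /* store bundle[0].hash in MongoDB */ });
```

The returned bundle’s transaction hash is what gets written back into the local database, closing the loop between the off-tangle records and the on-tangle proof.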
To save the information into a MongoDB collection, a controller instantiates a new model containing the entered data and passes it on to server.js, which handles the HTTP requests from the client.
There is no dedicated IOTA API call for generating seeds, but the project does supply a command-line tool for generating a random seed. We made our seeds relatable to the private information by concatenating a private key with the national insurance number for Applicants and the Doctor’s ID for Doctors. After a seed was generated, a fresh address was created for each new transaction.
To make the functions from iota.lib.js a bit more usable, we wrapped the existing callback-based structure in Promises. This let us compose the asynchronous calls much more cleanly than the library allows out of the box.
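In essence, the wrapper looked something like this (a generic sketch; Node’s built-in util.promisify would serve the same purpose today):

```javascript
// Wrap a Node-style, error-first callback function in a Promise so that
// tangle calls can be chained with .then()/.catch() instead of nesting.
function promisify(fn, context) {
  return (...args) =>
    new Promise((resolve, reject) => {
      fn.call(context, ...args, (err, result) => {
        if (err) reject(err);
        else resolve(result);
      });
    });
}

// Applied to iota.lib.js (method names as of late 2017; assumptions here):
// const sendTransfer = promisify(iota.api.sendTransfer, iota.api);
// sendTransfer(seed, depth, minWeightMagnitude, transfers)
//   .then((bundle) => saveTransactionId(bundle[0].hash))
//   .catch(console.error);
```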
Here is an overview of the architecture:
Once the data and the transactions were issued, the next step was to provide a way of viewing the existing applications and certificates. So we created a second page of the UI for listing all applications with relevant information read from the MongoDB-collection.
This doesn’t, however, provide a great way of finding the main type of fraud we were considering, namely Applicants reusing information about Doctors, which makes it look like a single Doctor issued an unreasonable number of certificates. A pretty easy case to catch, one would think, but since the process is completely analog, done on paper in different boroughs by different administrators, it adds up to quite a large number of faked applications. This is the type of fraud we focused on in our processing.
So how can we flag cases that should be investigated in a user-friendly way? We chose the simplest option and created a second view of the UI, listing each Doctor in the system along with the number of certificates they have, supposedly, issued, sorted by that number. One could imagine making this a bit smarter by including the dates the certificates were issued and deriving a more differentiated metric of certificates per time unit, but that wasn’t in scope this time around. If a Doctor had issued more than 10 certificates, they were highlighted in red: a very simple but potentially effective way of communicating to the user that something needs to be investigated. Of course, the number 10 was completely arbitrary and could have been chosen differently; to decide on that number properly, one would first have to analyze historical data.
To sum up, Team Freedom had a lot of fun and learned tons about IOTA, ideation, cooperation, and creation in a short time frame. We managed to build a functioning Proof of Concept for how IOTA can be used for the secure issuing of medical certificates in order to prevent and detect fraud. We applied it to the Freedom Pass so that it would be easier to understand what was being done and why, but that in no way means the base structure cannot be used for other purposes; in fact, it was written specifically to be general enough to be interesting in other areas as well.
Is this the only way the problem could have been solved? No. Was it the easiest way of solving it? Absolutely not. However, we believe that only by experimenting with one of the few scalable and future-proof distributed ledger solutions can we achieve real applicability. Generally speaking, almost any distributed ledger application could have been built without a distributed ledger, but at greater financial, organizational or trust costs. IOTA is a very cost-effective and scalable solution, with the caveat that it is still in its infancy.
We at Datarella are strong believers in blockchain technology. We have been working on blockchain projects since 2015, with leading industry players and organisations. Since we are platform-agnostic, we have worked with Bitcoin, Ethereum, Hyperledger, IOTA, and other blockchains.
One key takeaway from two years of blockchain experience is that, as of fall 2017, most blockchains are still quite immature and a lot of work has to be done to make them industry-ready. We see a huge demand for development of, and investment in, not only blockchain-related projects but also the core blockchain protocols. The key development challenges in 2018 will be to significantly improve the scalability, stability and security of blockchain platforms.
As digital organisms fed by communities of developers, blockchain protocols evolve through changes in their code: either changes to the original code, or the addition of a new microorganism, a side chain, by forking the original chain. Both approaches could be combined by creating a forkless blockchain whose protocol contains rules that are created by other rules (see: the Nomic game). This way, forks would no longer be needed, since rules could be changed by other rules.
Most industry blockchain projects are developed using side chains. First, to eschew the disadvantages of public blockchains, such as Proof of Work; second, because public blockchains lack industry-grade conditions. Most, if not all, industry-led blockchain project teams would love to use public chains if they could rely on them.
Tragedy Of The Commons
That said, strong evolutionary processes in blockchains are needed. But where is the incentive for developers to invest resources into the core protocols? The only ways to benefit from working on core blockchain protocols are to mine tokens and profit from a potential increase in value, or to join one of the blockchain foundations and get paid by them. This imbalance, where nobody has an incentive to work on a core technology that everybody would like to see well developed, is known as the tragedy of the commons: the economic reward for a developer improving blockchain technology is low.
Funding for work on core blockchain protocols, and thereby incentives for developers, could be provided by private institutions such as Venture Capital (VC) firms, or by public funding, e.g. through a public crowdfunding initiative: the Ethereum Foundation could sell Ether through a crowdsale to the developer community working on a specific update in Ethereum’s evolutionary process. For the blockchain foundations, that would be straightforward thinking.
VCs, however, would have to make sure that their assets, i.e. portfolio companies, profit from an investment in the core blockchain protocol. This could happen indirectly: if blockchain projects don’t need to develop functionality that is already woven into the core protocol, they can minimize their efforts and streamline their roadmaps to exit. It is questionable whether that is an adequate benefit from the VC’s perspective.
Crowdstart Capital dedicates tokens to the blockchain developer community
With our sister company Crowdstart Capital (CSC), we are planning to address the funding challenge described above. Crowdstart Capital’s goal is to foster blockchain core technologies and applications, and CSC wants to help blockchain evolve into an enterprise-ready technology. To lay the basis for a cryptoeconomic incentive scheme that supports the development of blockchain-related projects and gives developers an incentive to dedicate their work to blockchain core protocols, XSC tokens will be dedicated to the active blockchain community.
Developers committing code to key blockchain projects can opt in to receive XSC tokens for every line of code that is accepted by the respective projects. CSC will set up a smart-contract-based system that pays out the tokens according to the commits. This incentive is meant as CSC’s contribution to the blockchain developer community; there will be no further obligations, i.e. CSC does not demand any return for it.
Technologies supported by these incentives include the core protocols of leading blockchains, such as Ethereum, as well as all projects that participate in the CSC acceleration program. In a second phase, it is planned that members of the community will be able to suggest projects for inclusion in the incentive scheme; which projects are included will be decided by the community in token-based ballots.
We know that we won’t achieve our goal overnight. And we know that we might have to adapt our plan when necessary. In the end, the most important factor is the blockchain community itself. If we can successfully motivate blockchain developers to join the scheme, to use the XSC tokens and to spread the word to their respective communities, then we can potentially crowdstart something new: an efficient incentive scheme for the evolution of blockchain technology.
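As an illustration of how such a payout could be aggregated off-chain before a smart contract disburses the tokens, here is a small sketch. The flat per-line rate and the data shape are our assumptions for this example, not CSC’s published scheme.

```javascript
// Aggregate XSC payouts per developer from a list of reviewed commits.
// Only commits that were accepted into the respective project count.
function computePayouts(commits, tokensPerLine) {
  const payouts = {};
  for (const commit of commits) {
    if (!commit.accepted) continue; // rejected commits earn nothing
    payouts[commit.author] =
      (payouts[commit.author] || 0) + commit.linesAccepted * tokensPerLine;
  }
  return payouts;
}
```

A real scheme would likely weight lines by review quality rather than count them flat, since a raw lines-of-code metric is easy to game.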
If you’re responsible for new product development in your company, you will be familiar with the several steps of that process. Experts mostly separate the new product development process into seven or eight steps, starting with idea generation and finishing with a post-launch review. The fact that more and more things are becoming smart, i.e. they either feature some intelligence or they are connected and controlled through the IoT, has significant implications for new product development, particularly for its very first phases.
Traditionally, ideation and screening of first product ideas have focused on research, brainstorming, SWOT analysis, market and consumer trends, and so forth. All these activities imply certain hypotheses and more or less tangible perceptions of products or product components. This works fine as long as the final product is a one-way product, i.e. once produced and sold it won’t change (other than to age and break, ultimately). However, smart things aren’t one-directional but bi-directional: they communicate, they change, and therefore their effects on consumers are far more complex and variable than those of their “dumb” predecessors.
The smarter a thing, or a group of things, is, the more complex the situations it will create for its environment and its users. The much-discussed self-driving cars, whose algorithms must decide whom to run over in case of an inevitable accident, are a good example of the complexity future products will create.
Now, what are the implications of smart things and the IoT for new product development? The answer is pretty easy: we just have to look at the discussions surrounding the IoT: privacy, responsibility, sustainability, awareness, acceptance, relevance, and ethics. Is my data secure? Who takes responsibility for data provenance? Do I want this thing to be smart? Do I accept a thing’s decision? Do things add value? Do others accept me using my smart thing? Can I defend using my smart thing against my beliefs?
We can sort these critical questions into three categories:
– philosophy (ethical aspects),
– sociology (responsibility/acceptance aspects) and
– psychology (awareness/relevance aspects).
Philosophy, sociology and psychology are the “new” fields for benchmarking new product ideas. In contrast to present techniques for finding new product ideas, corporate innovation managers will have to broaden their scope, and companies will have to adapt by hiring and training their innovation departments in these fields of expertise. Today, only very few companies seem to have internalized this new way of thinking. Just look at how Apple creates and markets its products: there is no talk of product features, but of sustainable production chains, family accounts and enhanced well-being.
Would you have thought that philosophy, sociology and psychology would play a pivotal role in new product development? Could that mean that philosophers sleeping in ceramic jars can now afford posh apartments, formerly unemployed sociologists can choose their employers, and psychologists leave their universities to actually develop new products? And will we see a lot more useful, meaningful, usable and accepted products? I think so.
Too much workload, stress and ultimately burnout: that’s how many people see their everyday life. One way to handle the negative aspects of daily routines is to make it to the weekend (TGIF); another is to go on vacation. Whereas the first tactic is easy to realize but only helps to a certain degree, the latter is possible only once or twice a year for most of us. But there is another, easier way to calm down and boost your wellbeing and happiness: create and repeat small positive experiences, and you will see an immediate effect on your overall awareness of life.
As Sonja Lyubomirsky and Kristin Layous show in their paper, based on research by Ed Diener and others, it’s the small and regularly repeated positive experiences that influence your wellbeing and happiness to a great extent. According to their Positive-Activity Model, features of positive activities, including their dosage, variety, sequence, and built-in social support, all influence their success in that process.
For our editorial team at Datarella, this model was a challenge: how could we use the explore app to make this model work optimally? As always, the team decided not to aim for the optimal but for a good solution: to invite volunteers to participate in a special program and ultimately to optimize the program together with the explore users. This program, SMILE!, was designed to be very lean, with a minimal number of interactions and active participation for just a few days, so as not to interfere with the model’s cause-and-effect relationships.
After the 5-day-program, our data team analyzed the results. In short: our findings completely back the findings of Ed Diener et al.:
participants of the SMILE! program experienced a significant increase in their happiness with each additional day of the program
participants of the SMILE! program experienced an increase in their happiness compared with a test group of non-participants, whose happiness level remained constant
small, regular and well-portioned challenges triggered a change in the participants’ behavior, resulting in an increased happiness level
The two charts below demonstrate the SMILE! effect:
(For non-German speaking users: Translation of Chart 1: “Did you laugh a lot today?”; Chart 2: “Do you think you laughed more often at the end of the program?”; Feature Visual: “How do you feel at the moment?”)
The Datarella team itself participated in SMILE!, too. For me personally, it was a great experience. Being an optimistic guy who smiles often, the SMILE! challenges opened my eyes: in reality, I had been smiling much less than I thought, and triggered by the SMILE! challenges I was prompted to become much friendlier.
For Datarella, the SMILE! program was a first test. We are planning to roll out several programs of this kind, all of them aiming to boost personal wellbeing and happiness. Since our editorial team is still in the process of creating these programs we’d like to invite you to participate and add your ideas, proposals and thoughts! We’d love to hear from you!