Welcome to Knowledge Base!

KB at your fingertips

This is a one-stop global knowledge base where you can learn about all the products, solutions, and support features.


React Native Performance in Marketplace


React Native is used in multiple places across multiple apps in the Facebook family, including a top-level tab in the main Facebook apps. Our focus for this post is a highly visible product, Marketplace. It is available in a dozen or so countries and enables users to discover products and services provided by other users.

In the first half of 2017, through the joint effort of the Relay Team, the Marketplace team, the Mobile JS Platform team, and the React Native team, we cut Marketplace Time to Interaction (TTI) in half for Android Year Class 2010-11 devices. Facebook has historically considered these devices to be low-end Android devices, and they have the slowest TTIs of any platform or device type.

A typical React Native startup looks something like this:

Disclaimer: ratios aren't representative and will vary depending on how React Native is configured and used.

We first initialize the React Native core (aka the “Bridge”) before running the product-specific JavaScript, which determines what native views React Native will render during Native Processing Time.

A different approach​

One of the earlier mistakes that we made was to let Systrace and CTScan drive our performance efforts. These tools helped us find a lot of low-hanging fruit in 2016, but we discovered that neither Systrace nor CTScan is representative of production scenarios, and they cannot emulate what happens in the wild. Ratios of time spent in the breakdowns are often incorrect and, at times, wildly off-base. At the extreme, some things that we expected to take a few milliseconds actually take hundreds or thousands of milliseconds. That said, CTScan is useful and we've found it catches a third of regressions before they hit production.

On Android, we attribute the shortcomings of these tools to the fact that 1) React Native is a multithreaded framework, 2) Marketplace is co-located with a multitude of complex views such as Newsfeed and other top-level tabs, and 3) computation times vary wildly. Thus, this half, we let production measurements and breakdowns drive almost all of our decision making and prioritization.

Down the path of production instrumentation​

Instrumenting production may sound simple on the surface, but it turned out to be quite a complex process. It took multiple iteration cycles of 2-3 weeks each, due to the latency of landing a commit in master, pushing the app to the Play Store, and gathering sufficient production samples to have confidence in our work. Each iteration cycle involved discovering whether our breakdowns were accurate, whether they had the right level of granularity, and whether they properly added up to the whole time span. We could not rely on alpha and beta releases because they are not representative of the general population. In essence, we tediously built an accurate production trace from the aggregate of millions of samples.

One of the reasons we meticulously verified that every millisecond in breakdowns properly added up to their parent metrics was that we realized early on there were gaps in our instrumentation. It turned out that our initial breakdowns did not account for stalls caused by thread jumps. Thread jumps themselves aren't expensive, but thread jumps to busy threads already doing work are very expensive. We eventually reproduced these blockages locally by sprinkling Thread.sleep() calls at the right moments, and we managed to fix them by:

  1. removing our dependency on AsyncTask,
  2. undoing the forced initialization of ReactContext and NativeModules on the UI thread, and
  3. removing the dependency on measuring the ReactRootView at initialization time.

Together, removing these thread blockage issues reduced the startup time by over 25%.
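
As an aside, the sanity check described above (that child breakdowns account for every millisecond of their parent metric) can be sketched roughly as follows. The span names and numbers are illustrative only, not Facebook's actual instrumentation:

// Warn when child spans fail to account for their parent span, the symptom
// that exposed the thread-jump stalls described above. (Illustrative sketch.)
function validateBreakdown(parent, children, toleranceMs = 1) {
  const childTotal = children.reduce((sum, c) => sum + (c.end - c.start), 0);
  const unaccounted = parent.end - parent.start - childTotal;
  if (unaccounted > toleranceMs) {
    console.warn(
      `${parent.name}: ${unaccounted}ms unaccounted for; likely a gap such ` +
        'as a thread jump onto an already busy thread',
    );
  }
}

// Hypothetical spans, in milliseconds; note the 50ms gap before the last one.
validateBreakdown({name: 'MARKETPLACE_TTI', start: 0, end: 1200}, [
  {name: 'BRIDGE_STARTUP', start: 0, end: 400},
  {name: 'RUN_JS_BUNDLE', start: 400, end: 900},
  {name: 'NATIVE_PROCESSING', start: 950, end: 1200},
]);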

Production metrics also challenged some of our prior assumptions. For example, we used to pre-load many JavaScript modules on the startup path under the assumption that co-locating modules in one bundle would reduce their initialization cost. However, the cost of pre-loading and co-locating these modules far outweighed the benefits. By re-configuring our inline require blacklists and removing JavaScript modules from the startup path, we were able to avoid loading unnecessary modules such as Relay Classic (when only Relay Modern was necessary). Today, our RUN_JS_BUNDLE breakdown is over 75% faster.
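
The inline require pattern referenced above can be sketched like this; the module name is only illustrative, and the blacklist configuration itself is not shown:

// Instead of a top-level import, which is evaluated as part of RUN_JS_BUNDLE
// even when the screen never uses it:
// import RelayClassic from 'RelayClassic';

let RelayClassic = null;

function getRelayClassic() {
  // Loaded the first time this code path actually runs, keeping the module
  // off the startup path for screens that only need Relay Modern.
  if (RelayClassic == null) {
    RelayClassic = require('RelayClassic');
  }
  return RelayClassic;
}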

We also found wins by investigating product-specific native modules. For example, by lazily injecting a native module's dependencies, we reduced that native module's cost by 98%. By removing the contention of Marketplace startup with other products, we reduced startup by an equivalent interval.

The best part is that many of these improvements are broadly applicable to all screens built with React Native.

Conclusion​

People assume that React Native startup performance problems are caused by JavaScript being slow or by exceedingly high network times. While speeding up things like JavaScript would bring down TTI by a non-trivial amount, each of these contributes a much smaller percentage of TTI than was previously believed.

The lesson so far has been to measure, measure, measure! Some wins come from moving run-time costs to build time, such as Relay Modern and Lazy NativeModules. Other wins come from avoiding work by being smarter about parallelizing code or removing dead code. And some wins come from large architectural changes to React Native, such as cleaning up thread blockages. There is no grand solution to performance, and longer-term performance wins will come from incremental instrumentation and improvements. Do not let cognitive bias influence your decisions. Instead, carefully gather and interpret production data to guide future work.

Future plans​

In the long term, we want Marketplace TTI to be comparable to similar products built with Native, and, in general, have React Native performance on par with native performance. Furthermore, although this half we drastically reduced the bridge startup cost by about 80%, we plan to bring the cost of the React Native bridge close to zero via projects like Prepack and more build time processing.

Tags:
  • engineering


React Native Monthly #3


The React Native monthly meeting continues! This month's meeting was a bit shorter as most of our teams were busy shipping. Next month, we will be at the React Native EU conference in Wroclaw, Poland. Make sure to grab a ticket, and we'll see you there in person! Meanwhile, let's see what our teams are up to.

Teams​

At this third meeting, five teams joined us:

  • Callstack
  • Expo
  • Facebook
  • Microsoft
  • Shoutem

Notes​

Here are the notes from each team:

Callstack​

  • Recently open sourced react-native-material-palette. It extracts prominent colors from images to help you create visually engaging apps. It's Android only at the moment, but we are looking into adding support for iOS in the future.
  • We have landed HMR support in haul and a bunch of other cool stuff! Check out the latest releases.
  • React Native EU 2017 is coming! Next month is all about React Native and Poland! Make sure to grab the last tickets available here.

Expo​

  • Released support for installing npm packages on Snack. Usual Expo restrictions apply -- packages can't depend on custom native APIs that aren't already included in Expo. We are also working on supporting multiple files and uploading assets in Snack. Satyajit will talk about Snack at React Native Europe.
  • Released SDK20 with camera, payments, secure storage, magnetometer, pause/resume fs downloads, and improved splash/loading screen.
  • Continuing to work with Krzysztof on react-native-gesture-handler. Please give it a try, rebuild some gesture that you have previously built using PanResponder or native gesture recognizers and let us know what issues you encounter.
  • Experimenting with JSC debugging protocol, working on a bunch of feature requests on Canny.

Facebook​

  • Last month we discussed management of the GitHub issue tracker and that we would try to make improvements to address the maintainability of the project.
  • Currently, the number of open issues is holding steady at around 600, and it seems like it may stay that way for a while. In the past month, we have closed 690 issues due to lack of activity (defined as no comments in the last 60 days). Out of those 690 issues, 58 were re-opened for a variety of reasons (a maintainer committed to providing a fix, or a contributor made a great case for keeping the issue open).
  • We plan to continue with the automated closing of stale issues for the foreseeable future. We’d like to be in a state where every impactful issue opened in the tracker is acted upon, but we’re not there yet. We need all the help we can get from maintainers to triage issues and make sure we don't miss issues that introduce regressions or introduce breaking changes, especially those that affect newly created projects. People interested in helping out can use the Facebook GitHub Bot to triage issues and pull requests. The new Maintainers Guide contains more information on triage and use of the GitHub Bot. Please add yourself to the issue task force and encourage other active community members to do the same!

Microsoft​

  • The new Skype app is built on top of React Native in order to facilitate sharing as much code between platforms as possible. The React Native-based Skype app is currently available in the Android and iOS app stores.
  • While building the Skype app on React Native, we send pull requests to React Native in order to address bugs and missing features that we come across. So far, we've gotten about 70 pull requests merged.
  • React Native enabled us to power both the Android and iOS Skype apps from the same codebase. We also want to use that codebase to power the Skype web app. To help us achieve that goal, we built and open sourced a thin layer on top of React/React Native called ReactXP. ReactXP provides a set of cross platform components that get mapped to React Native when targeting iOS/Android and to react-dom when targeting the web. ReactXP's goals are similar to another open source library called React Native for Web. There's a brief description of how the approaches of these libraries differ in the ReactXP FAQ.

Shoutem​

  • We are continuing our efforts on improving and simplifying the developer experience when building apps using Shoutem.
  • Started migrating all our apps to react-navigation, but we ended up postponing this until a more stable version is released, or until one of the native navigation solutions becomes stable.
  • Updating all our extensions and most of our open source libraries (animation, theme, ui) to React Native 0.47.1.

Next session​

The next session is scheduled for Wednesday, September 13, 2017. As this was only our third meeting, we'd like to know how these notes benefit the React Native community. Feel free to ping me on Twitter if you have any suggestions on how we should improve the output of the meeting.

Tags:
  • engineering


React Native Monthly #4


The React Native monthly meeting continues! Here are the notes from each team:

Callstack​

  • React Native EU is over. More than 300 participants from 33 countries visited Wroclaw. Talks can be found on YouTube.
  • We are slowly getting back to our open source schedule after the conference. One thing worth mentioning is that we are working on the next release of react-native-opentok, which fixes most of the existing issues.

GeekyAnts​

Trying to lower the entry barrier for developers embracing React Native with the following:

  • Announced BuilderX.io at React Native EU. BuilderX is a design tool that directly works with JavaScript files (only React Native is supported at the moment) to generate beautiful, readable, and editable code.
  • Launched ReactNativeSeed.com, which provides a set of boilerplates for your next React Native project. It comes with a variety of options, including TypeScript and Flow for data types; MobX, Redux, and mobx-state-tree for state management; and CRNA or plain React Native as the stack.

Expo​

  • Will release SDK 21 shortly, which adds support for react-native 0.48.3 and a bunch of bugfixes/reliability improvements/new features in the Expo SDK, including video recording, a new splash screen API, support for react-native-gesture-handler, and improved error handling.
  • Re: react-native-gesture-handler, Krzysztof Magiera of Software Mansion continues pushing this forward and we've been helping him with testing it and funding part of his development time. Having this integrated in Expo in SDK21 will allow people to play with it easily in Snack, so we're excited to see what people come up with.
  • Re: improved error logging/handling - see this gist of an internal Expo PR for details on logging (in particular, "Problem 2"), and this commit for a change that handles failed attempts to import npm standard library modules. There is plenty of opportunity to improve error messages upstream in React Native in this way, and we will work on follow-up upstream PRs. It would be great for the community to get involved too.
  • native.directory continues to grow; you can add your projects from the GitHub repo.
  • Visiting hackathons around North America, including PennApps, Hack The North, HackMIT, and soon MHacks.

Facebook​

  • Working on improving <Text> and <TextInput> components on Android. (Native auto-growing for <TextInput>; layout issues with deeply nested <Text> components; better code structure; performance optimizations).
  • We're still looking for additional contributors who would like to help triage issues and pull requests.

Microsoft​

  • Released the Code Signing feature for CodePush. React Native developers are now able to sign their application bundles in CodePush. The announcement can be found here.
  • Working on completing integration of CodePush to Mobile Center. Considering test/crash integration as well.

Next session​

The next session is scheduled for Wednesday, October 10, 2017. As this was only our fourth meeting, we'd like to know how these notes benefit the React Native community. Feel free to ping me on Twitter if you have any suggestions on how we should improve the output of the meeting.

Tags:
  • engineering


React Native Monthly #5


The React Native monthly meeting continues! Let's see what our teams are up to.

Callstack​

  • We’ve been working on React Native CI. Most importantly, we have migrated from Travis to Circle, leaving React Native with a single, unified CI pipeline.
  • We’ve organised Hacktoberfest - React Native edition, where, together with attendees, we tried to submit many pull requests to open source projects.
  • We keep working on Haul. Last month, we submitted two new releases, including Webpack 3 support. We plan to add CRNA and Expo support, as well as work on better HMR. Our roadmap is public on the issue tracker. If you would like to suggest improvements or give feedback, let us know!

Expo​

  • Released Expo SDK 22 (using React Native 0.49) and updated CRNA for it.
    • Includes improved splash screen API, basic ARKit support, “DeviceMotion” API, SFAuthenticationSession support on iOS11, and more.
  • Your snacks can now have multiple JavaScript files and you can upload images and other assets by just dragging them into the editor.
  • Contribute to react-navigation to add support for iPhone X.
  • Focusing our attention on rough edges when building large applications with Expo. For example:
    • First-class support for deploying to multiple environments: staging, production, and arbitrary channels. Channels will support rolling back and setting the active release for a given channel. Let us know if you want to be an early tester, @expo_io.
    • We are also working on improving our standalone app building infrastructure and adding support for bundling images and other non-code assets in standalone app builds while keeping the ability to update assets over the air.

Facebook​

  • Better RTL support:
    • We’re introducing a number of direction-aware styles.
      • Position:
        • (left|right) → (start|end)
      • Margin:
        • margin(Left|Right) → margin(Start|End)
      • Padding:
        • padding(Left|Right) → padding(Start|End)
      • Border:
        • borderTop(Left|Right)Radius → borderTop(Start|End)Radius
        • borderBottom(Left|Right)Radius → borderBottom(Start|End)Radius
        • border(Left|Right)Width → border(Start|End)Width
        • border(Left|Right)Color → border(Start|End)Color
    • The meanings of “left” and “right” were swapped in RTL for position, margin, padding, and border styles. Within a few months, we’re going to remove this behaviour and make “left” always mean “left” and “right” always mean “right”. The breaking changes are hidden under a flag. Use I18nManager.swapLeftAndRightInRTL(false) in your React Native components to opt into them (see the sketch after this list).
  • Working on Flow typing our internal native modules and using those to generate interfaces in Java and protocols in ObjC that the native implementations must implement. We hope this codegen becomes open source next year, at the earliest.
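
As a rough illustration of the direction-aware styles and the opt-in flag described in the notes above, here is a minimal sketch; the component and the specific style values are hypothetical, not taken from Facebook's code:

import React from 'react';
import {I18nManager, StyleSheet, Text, View} from 'react-native';

// Opt into the new behaviour where "left" always means "left" in RTL,
// and use the direction-aware start/end styles instead.
I18nManager.swapLeftAndRightInRTL(false);

const styles = StyleSheet.create({
  row: {
    paddingStart: 16, // instead of paddingLeft; flips automatically in RTL
    marginEnd: 8, // instead of marginRight
    borderStartWidth: 1, // instead of borderLeftWidth
  },
});

// Hypothetical component using the direction-aware styles.
const Row = ({label}) => (
  <View style={styles.row}>
    <Text>{label}</Text>
  </View>
);

export default Row;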

Infinite Red​

  • New OSS tool for helping React Native and other projects. More here.
  • Revamping Ignite for a new boilerplate release (code name: Bowser).

Shoutem​

  • Improving the development flow on Shoutem. We want to streamline the process from creating an app to the first custom screen and make it really easy, thus lowering the barrier for new React Native developers. We prepared a few workshops to test out new features. We also improved the Shoutem CLI to support new flows.
  • Shoutem UI received a few component improvements and bugfixes. We also checked compatibility with the latest React Native versions.
  • The Shoutem platform received a few notable updates; new integrations are available as part of the open-source extensions project. We are really excited to see active development on Shoutem extensions from other developers. We actively reach out to them and offer advice and guidance about their extensions.

Next session​

The next session is scheduled for Wednesday, December 6, 2017. Feel free to ping me on Twitter if you have any suggestions on how we should improve the output of the meeting.

Tags:
  • engineering


React Native Monthly #6


The React Native monthly meeting is still going strong! Make sure to check the note at the bottom of this post about the next sessions.

Expo​

  • Congratulations to Devin Abbott and Houssein Djirdeh on their pre-release of the "Full Stack React Native" book! It walks you through learning React Native by building several small apps. You can try those apps out on https://www.fullstackreact.com/react-native/ before buying the book.
  • Released a first (experimental) version of reason-react-native-scripts to help people easily try out ReasonML.
  • Expo SDK 24 is released! It uses React Native 0.51 and includes a bunch of new features and improvements: bundling images in standalone apps (no need to cache on first load!), an image manipulation API (crop, resize, rotate, flip), a face detection API, new release channel features (set the active release for a given channel and roll back), a web dashboard to track standalone app builds, and a fix for a longstanding bug with the OpenGL Android implementation and the Android multitasker, just to name a few things.
  • We are allocating more resources to React Navigation starting this January. We strongly believe that it is both possible and desirable to build React Native navigation with just React components and primitives like Animated and react-native-gesture-handler and we’re really excited about some of the improvements we have planned. If you're looking to contribute to the community, check out react-native-maps and react-native-svg, which could both use some help!

Infinite Red​

  • We have our keynote speakers for Chain React Conf: Kent C. Dodds and Tracy Lee. We will be opening the CFP very soon.
  • Community chat now at 1600 people.
  • React Native Newsletter now at 8500 subscribers.
  • Currently researching best practices for making RN crash-resistant; reports to follow.
  • Adding the ability to report from Solidarity.
  • Published a HOW-TO for releasing on React Native and Android.

Microsoft​

  • A pull request has been started to migrate the core React Native Windows bridge to .NET Standard, making it effectively OS-agnostic. The hope is that many other .NET Core platforms can extend the bridge with their own threading models, JavaScript runtimes, and UIManagers (think JavaScriptCore, Xamarin.Mac, Linux Gtk#, and Samsung Tizen options).

Wix​

  • Detox
    • In order for us to scale with E2E tests and minimize time spent on CI, we are working on parallelization support for Detox.
    • Submitted a pull request to enable support for custom flavor builds, to better support mocking on E2E.
  • DetoxInstruments
    • Working on the killer feature of DetoxInstruments is proving to be a very challenging task: taking a JavaScript backtrace at any given time requires a custom JSCore implementation to support JS thread suspension. Testing the profiler internally on Wix’s app revealed interesting insights about the JS thread.
    • The project is still not stable enough for general use, but it is being actively worked on, and we hope to announce it soon.
  • React Native Navigation
    • The V2 development pace has increased substantially. Up until now, we only had one developer working on it 20% of his time; we now have three developers working on it full time!
  • Android Performance
    • Replacing the old JSCore bundled in RN with its newest version (tip of the webkitGTK project, with a custom JIT configuration) produced a 40% performance increase on the JS thread. Next up is compiling a 64-bit version of it. This effort is based on the JSC build scripts for Android. Follow its current status here.

Next sessions​

There's been some discussion on re-purposing this meeting to discuss a single, specific topic (e.g. navigation, moving React Native modules into separate repos, documentation, ...). That way, we feel we can contribute best to the React Native community. It might take place in the next meeting session. Feel free to tweet what you'd like to see covered as a topic.

Tags:
  • engineering


Implementing Twitter’s App Loading Animation in React Native


Twitter’s iOS app has a loading animation I quite enjoy.

Once the app is ready, the Twitter logo delightfully expands, revealing the app.

I wanted to figure out how to recreate this loading animation with React Native.


To understand how to build it, I first had to understand the different pieces of the loading animation. The easiest way to see the subtlety is to slow it down.

There are a few major pieces in this that we will need to figure out how to build.

  1. Scaling the bird.
  2. As the bird grows, showing the app underneath.
  3. Scaling the app down slightly at the end.

It took me quite a while to figure out how to make this animation.

I started with an incorrect assumption that the blue background and Twitter bird were a layer on top of the app and that as the bird grew, it became transparent which revealed the app underneath. This approach doesn’t work because the Twitter bird becoming transparent would show the blue layer, not the app underneath!

Luckily for you, dear reader, you don’t have to go through the same frustration I did. You get this nice tutorial skipping to the good stuff!


The right way​

Before we get to code, it is important to understand how to break this down. To help visualize this effect, I recreated it in CodePen (embedded a few paragraphs below) so you can interactively see the different layers.

There are three main layers to this effect. The first is the blue background layer. Even though this seems to appear on top of the app, it is actually in the back.

We then have a plain white layer. And then lastly, in the very front, is our app.


The main trick to this animation is using the Twitter logo as a mask and masking both the app and the white layer. I won’t go too deep into the details of masking; there are plenty of resources online for that.

The basics of masking in this context are having images where opaque pixels of the mask show the content they are masking whereas transparent pixels of the mask hide the content they are masking.

We use the Twitter logo as a mask, having it mask two layers: the solid white layer and the app layer.

To reveal the app, we scale the mask up until it is larger than the entire screen.

While the mask is scaling up, we fade in the opacity of the app layer, showing the app and hiding the solid white layer behind it. To finish the effect, we start the app layer at a scale > 1, and scale it down to 1 as the animation is ending. We then hide the non-app layers as they will never be seen again.

They say a picture is worth 1,000 words. How many words is an interactive visualization worth? Click through the animation with the “Next Step” button. Showing the layers gives you a side view perspective. The grid is there to help visualize the transparent layers.

Now, for the React Native​

Alrighty. Now that we know what we are building and how the animation works, we can get down to the code — the reason you are really here.

The main piece of this puzzle is MaskedViewIOS, a core React Native component.

import {MaskedViewIOS, Text, View} from 'react-native';

<MaskedViewIOS maskElement={<Text>Basic Mask</Text>}>
  <View style={{backgroundColor: 'blue'}} />
</MaskedViewIOS>;

MaskedViewIOS takes props maskElement and children. The children are masked by the maskElement. Note that the mask doesn’t need to be an image, it can be any arbitrary view. The behavior of the above example would be to render the blue view, but for it to be visible only where the words “Basic Mask” are from the maskElement. We just made complicated blue text.

What we want to do is render our blue layer, and then on top render our masked app and white layers with the Twitter logo.

{fullScreenBlueLayer}
<MaskedViewIOS
  style={{flex: 1}}
  maskElement={
    <View style={styles.centeredFullScreen}>
      <Image source={twitterLogo} />
    </View>
  }>
  {fullScreenWhiteLayer}
  <View style={{flex: 1}}>
    <MyApp />
  </View>
</MaskedViewIOS>;

This will give us the layers we see below.

Now for the Animated part​

We have all the pieces we need to make this work, the next step is animating them. To make this animation feel good, we will be utilizing React Native’s Animated API.

Animated lets us define our animations declaratively in JavaScript. By default, these animations run in JavaScript and tell the native layer what changes to make on every frame. Even though JavaScript will try to update the animation every frame, it will likely not be able to do that fast enough and will cause dropped frames (jank) to occur. Not what we want!

Animated has special behavior to allow you to get animations without this jank. Animated has a flag called useNativeDriver which sends your animation definition from JavaScript to native at the beginning of your animation, allowing the native side to process the updates to your animation without having to go back and forth to JavaScript every frame. The downside of useNativeDriver is you can only update a specific set of properties, mostly transform and opacity. You can’t animate things like background color with useNativeDriver, at least not yet — we will add more over time, and of course you can always submit a PR for properties you need for your project, benefitting the whole community 😀.

Since we want this animation to be smooth, we will work within these constraints. For a more in depth look at how useNativeDriver works under the hood, check out our blog post announcing it.

Breaking down our animation​

There are 4 components to our animation:

  1. Enlarge the bird, revealing the app and the solid white layer
  2. Fade in the app
  3. Scale down the app
  4. Hide the white layer and blue layer when it is done

With Animated, there are two main ways to define your animation. The first is by using Animated.timing, which lets you say exactly how long your animation will run for, along with an easing curve to smooth out the motion. The other approach is by using the physics-based APIs such as Animated.spring. With Animated.spring, you specify parameters like the amount of friction and tension in the spring, and let physics run your animation.
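
For concreteness, here is a minimal sketch of both approaches on a single value; the duration, friction, and tension numbers are illustrative rather than the ones used for the Twitter loader below:

import {Animated} from 'react-native';

const progress = new Animated.Value(0);

// Time-based: runs for exactly 1000 ms, shaped by an easing curve.
const timing = Animated.timing(progress, {
  toValue: 100,
  duration: 1000,
  useNativeDriver: true,
});

// Physics-based: no fixed duration; friction and tension shape the motion.
const spring = Animated.spring(progress, {
  toValue: 100,
  friction: 7,
  tension: 40,
  useNativeDriver: true,
});

// Only one of these would be started on the value at a time.
timing.start();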

We have multiple animations we want to be running at the same time, which are all closely related to each other. For example, we want the app to start fading in while the mask is mid-reveal. Because these animations are closely related, we will use Animated.timing with a single Animated.Value.

Animated.Value is a wrapper around a native value that Animated uses to know the state of an animation. You typically want to only have one of these for a complete animation. Most components that use Animated will store the value in state.

Since I’m thinking about this animation as steps occurring at different points in time along the complete animation, we will start our Animated.Value at 0, representing 0% complete, and end our value at 100, representing 100% complete.

Our initial component state will be the following.

state = {
  loadingProgress: new Animated.Value(0),
};

When we are ready to begin the animation, we tell Animated to animate this value to 100.

Animated.timing(this.state.loadingProgress, {
  toValue: 100,
  duration: 1000,
  useNativeDriver: true, // This is important!
}).start();

I then try to figure out a rough estimate of the different pieces of the animation and the values I want them to have at different stages of the overall animation. Below is a breakdown of the different pieces of the animation and what I think their values should be at different points as we progress through time.

The Twitter bird mask should start at scale 1, and it gets smaller before it shoots up in size. So at 10% through the animation, it should have a scale value of .8 before shooting up to scale 70 at the end. Picking 70 was pretty arbitrary, to be honest; it needed to be large enough that the bird fully revealed the screen, and 60 wasn’t big enough 😀. Something interesting about this part though is that the higher the number, the faster it will look like it is growing because it has to get there in the same amount of time. This number took some trial and error to make look good with this logo. Logos / devices of different sizes will require this end-scale to be different to ensure the entire screen is revealed.

The app should stay hidden for a while, at least through the Twitter logo getting smaller. Based on the official animation, I want to start showing it when the bird is midway through scaling up and to fully reveal it pretty quickly. So at 15% we start showing it, and at 30% through the overall animation it is fully visible.

The app scale starts at 1.1 and scales down to its regular scale by the end of the animation.

And now, in code.​

What we essentially did above is map the values from the animation progress percentage to the values for the individual pieces. We do that with Animated using .interpolate. We create 3 different style objects, one for each piece of the animation, using interpolated values based off of this.state.loadingProgress.

const loadingProgress = this.state.loadingProgress;

const opacityClearToVisible = {
  opacity: loadingProgress.interpolate({
    inputRange: [0, 15, 30],
    outputRange: [0, 0, 1],
    extrapolate: 'clamp',
    // clamp means when the input is 30-100, output should stay at 1
  }),
};

const imageScale = {
  transform: [
    {
      scale: loadingProgress.interpolate({
        inputRange: [0, 10, 100],
        outputRange: [1, 0.8, 70],
      }),
    },
  ],
};

const appScale = {
  transform: [
    {
      scale: loadingProgress.interpolate({
        inputRange: [0, 100],
        outputRange: [1.1, 1],
      }),
    },
  ],
};

Now that we have these style objects, we can use them when rendering the snippet of the view from earlier in the post. Note that only Animated.View, Animated.Text, and Animated.Image are able to use style objects that use Animated.Value.

const fullScreenBlueLayer = (
  <View style={styles.fullScreenBlueLayer} />
);
const fullScreenWhiteLayer = (
  <View style={styles.fullScreenWhiteLayer} />
);

return (
  <View style={styles.fullScreen}>
    {fullScreenBlueLayer}
    <MaskedViewIOS
      style={{flex: 1}}
      maskElement={
        <View style={styles.centeredFullScreen}>
          <Animated.Image
            style={[styles.maskImageStyle, imageScale]}
            source={twitterLogo}
          />
        </View>
      }>
      {fullScreenWhiteLayer}
      <Animated.View
        style={[opacityClearToVisible, appScale, {flex: 1}]}>
        {this.props.children}
      </Animated.View>
    </MaskedViewIOS>
  </View>
);

Yay! We now have the animation pieces looking like we want. Now we just have to clean up our blue and white layers which will never be seen again.

To know when we can clean them up, we need to know when the animation is complete. Luckily, the .start we call on Animated.timing takes an optional callback that runs when the animation is complete.

Animated.timing(this.state.loadingProgress, {
  toValue: 100,
  duration: 1000,
  useNativeDriver: true,
}).start(() => {
  this.setState({
    animationDone: true,
  });
});

Now that we have a value in state to know whether we are done with the animation, we can modify our blue and white layers to use that.

const fullScreenBlueLayer = this.state.animationDone ? null : (
  <View style={[styles.fullScreenBlueLayer]} />
);
const fullScreenWhiteLayer = this.state.animationDone ? null : (
  <View style={[styles.fullScreenWhiteLayer]} />
);

Voila! Our animation now works and we clean up our unused layers once the animation is done. We have built the Twitter app loading animation!

But wait, mine doesn’t work!​

Don’t fret, dear reader. I too hate when guides only give you chunks of the code and don’t give you the completed source.

This component has been published to npm and is on GitHub as react-native-mask-loader. To try this out on your phone, it is available on Expo here:

More Reading / Extra Credit​

  1. This gitbook is a great resource to learn more about Animated after you have read the React Native docs.
  2. The actual Twitter animation seems to speed up the mask reveal towards the end. Try modifying the loader to use a different easing function (or a spring!) to better match that behavior.
  3. The current end-scale of the mask is hard coded and likely won’t reveal the entire app on a tablet. Calculating the end scale based on screen size and image size would be an awesome PR.
Tags:
  • engineering