React Native Shadow is Missing on iOS but is Okay on Android

In one of our projects, some items that had a shadow were working just fine on Android, but the shadow was missing on iOS.

After investigating, it turned out to be related to the items that had overflow: 'hidden', which on iOS resulted in the shadow being clipped.

It turns out that on iOS the shadow is rendered as part of the component you define it on, so setting overflow to hidden clips the shadow away along with everything else outside the bounds. On Android, the elevation shadow is drawn outside of the component, so it is just fine to have overflow: 'hidden' and still get the shadow.

The solution was to wrap the component in another <View /> that carries the shadow styles, while keeping the overflow: 'hidden' on the inner component.

Example code:

// Before:
// ...
<View style={ { 
  // we need the overflow hidden to round the images in the content
  overflow: 'hidden',
  borderRadius: 20,
  
  // shadow definition
  // (the shadow* props are iOS-only; elevation is the Android equivalent)
  shadowColor: '#000',
  shadowOffset: {
    width: 0,
    height: 2,
  },
  shadowOpacity: 0.25,
  shadowRadius: 3.84,
  elevation: 5,
} }>
  { children }
</View>

// After:
<View style={ { 
  // we still need the same radius, so the shadow would have the same shape
  // as the inner container
  borderRadius: 20,
  
  // shadow definition
  shadowColor: '#000',
  shadowOffset: {
    width: 0,
    height: 2,
  },
  shadowOpacity: 0.25,
  shadowRadius: 3.84,
  elevation: 5,  
} }>
  <View style={ {
    // we need the overflow hidden to round the images in the content
    overflow: 'hidden',
    borderRadius: 20,
  } }>
    { children }
  </View>
</View>
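
If this pattern shows up in several places, it can be extracted into a small wrapper component. A minimal sketch (the ShadowedCard name and the style values are just placeholders):

import React from 'react';
import { View, StyleSheet } from 'react-native';

// Outer View carries the shadow; inner View does the clipping.
export function ShadowedCard ( { children } ) {
  return (
    <View style={ styles.shadow }>
      <View style={ styles.clip }>
        { children }
      </View>
    </View>
  );
}

const styles = StyleSheet.create( {
  shadow: {
    borderRadius: 20,
    shadowColor: '#000',
    shadowOffset: { width: 0, height: 2 },
    shadowOpacity: 0.25,
    shadowRadius: 3.84,
    elevation: 5,
  },
  clip: {
    // Clips the content (e.g. images) to the rounded corners.
    borderRadius: 20,
    overflow: 'hidden',
  },
} );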

So if you end up with missing shadows on iOS, make sure to check for overflow: 'hidden' on the element : )

SentryError: Native Client is not available, can’t start on native when updating expo-cli to 4.x.x (from version 3.22.3)

TL;DR: Update your metro.config.js to use @expo/metro-config, based on the latest guidelines (SDK 40+).

It’s funny when you encounter an error in a project and, after spending a lot of effort researching it, find out that the cause is the same as that of a totally different error in a very different project.

In my case this was caused by an outdated metro.config.js file, specifically the SVG loading code that uses react-native-svg-transformer.

To fix it, I replaced the metro config with the following:

const { getDefaultConfig } = require("@expo/metro-config");

module.exports = (async () => {
  // Start from Expo's default config instead of the bare metro-config one.
  const {
    resolver: { sourceExts, assetExts }
  } = await getDefaultConfig(__dirname);
  return {
    transformer: {
      // Let react-native-svg-transformer handle imported SVG files.
      babelTransformerPath: require.resolve("react-native-svg-transformer")
    },
    resolver: {
      // Treat .svg as source code instead of as a static asset.
      assetExts: assetExts.filter(ext => ext !== "svg"),
      sourceExts: [...sourceExts, "svg"]
    }
  };
})();

And also install the '@expo/metro-config' module:

yarn add @expo/metro-config
# or
npm install @expo/metro-config

More info in the post on the other error – Icons Not Loaded After Updating ReactNative Expo to Version 40

Cheers!

Icons Not Loaded After Updating ReactNative Expo to Version 40

TL;DR: If you’re loading SVGs, check your metro.config.js and see if you’re using getDefaultConfig from '@expo/metro-config'. If you’re requiring it from 'metro-config', you should update your code based on the one below. More info in the readme of react-native-svg-transformer.

Recently I had an issue while updating a project I was working on – after bumping the Expo SDK to version 40, the icons stopped working.

The project was using react-native-elements, so this was my first guess for the cause of the issue. Digging a bit deeper, it turned out that any icon from @expo/vector-icons was shown as an X in a square.

After digging through the project (and GitHub issues), I decided to create an empty Expo project and gradually include the files. Doing so, I found out that there was a custom metro.config.js that took care of loading the SVGs. I looked into the readme of 'react-native-svg-transformer' and, eureka 🙂, for SDK version 40 or newer the code in metro.config.js should be different:

const { getDefaultConfig } = require("@expo/metro-config");

module.exports = (async () => {
  // Start from Expo's default config instead of the bare metro-config one.
  const {
    resolver: { sourceExts, assetExts }
  } = await getDefaultConfig(__dirname);
  return {
    transformer: {
      // Let react-native-svg-transformer handle imported SVG files.
      babelTransformerPath: require.resolve("react-native-svg-transformer")
    },
    resolver: {
      // Treat .svg as source code instead of as a static asset.
      assetExts: assetExts.filter(ext => ext !== "svg"),
      sourceExts: [...sourceExts, "svg"]
    }
  };
})();

And don’t forget to install the '@expo/metro-config' module:

yarn add @expo/metro-config
# or
npm install @expo/metro-config

That’s it and happy hacking 🙂

Messing with Expo Permissions caused ImagePicker to misbehave

Have you seen the following alert?

Sorry, we need media library permissions to make this work!

Well… me too! 🙂 And I saw this error despite the fact that I had both the "CAMERA" and "MEDIA_LIBRARY" permissions added to my android.permissions array.

Googling didn’t help much, because it just led me to either the Expo permissions docs or the ImagePicker docs, from which it seemed that the needed permissions were already in place.

Luckily, I started digging through the Expo GitHub issues and found an issue opened by folks facing the same problem.

The solution is to add “READ_EXTERNAL_STORAGE” and “WRITE_EXTERNAL_STORAGE” to the permissions array, despite the docs listing them as permissions that are added by default.
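
In app.json that ends up looking something like this (a minimal sketch of just the relevant part – the rest of the expo config is elided):

{
  "expo": {
    "android": {
      "permissions": [
        "CAMERA",
        "MEDIA_LIBRARY",
        "READ_EXTERNAL_STORAGE",
        "WRITE_EXTERNAL_STORAGE"
      ]
    }
  }
}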

Updating a React Native/Expo image file does not update the visualization of this image everywhere in the app

I’ve had an interesting problem when saving and updating images in a React Native application built with Expo.

I’m building an app that has contacts and images (that are either taken from the phone contact entry or picked from the gallery).

The issue was that editing the image in one place and saving it would not update the contact image in the contacts list. When updating the image, I was updating the image file and overwriting it in the filesystem.

After saving it and going back to the previous screen, the old image was still there. Only after refreshing the application was it replaced.

Since I was reusing the file name, the prop in the contact card was not modified (the file path was the same), so the component didn’t know it had to re-render.

To solve that, I decided to update my helper function to add a timestamp to the filename. This way the file path would change, forcing all the components with the image to re-render.

import * as FileSystem from 'expo-file-system';

export async function persistCachedFile ( cachedFile: string, permanentFolder: string, fileId: string ) {
    const permanentDirectoryPath = `${ FileSystem.documentDirectory }${ permanentFolder }/`;
    const uniqueFilePath = `${ permanentDirectoryPath }${ fileId }-${ Date.now() }`;

    await ensureDirExists( permanentDirectoryPath );

    await FileSystem.copyAsync( {
        from: cachedFile,
        to: uniqueFilePath
    } );

    return uniqueFilePath;
}

The downside here is that the old files are going to stay in the app directory forever. To avoid that, we need to add a cleanup function. I came up with the following function that runs each time we copy the file.

export async function cleanupOldFilesAsync ( folder: string, fileId: string ) {
    // Find all files that have the fileId in their file name (and delete them):
    const directoryFiles = await FileSystem.readDirectoryAsync( folder );
    const previousImages = directoryFiles.filter( file => file.includes( fileId ) );

    // Delete the previous images.
    previousImages.forEach( previousImage => {
        // We don't await, because removing the files is not critical
        FileSystem.deleteAsync( `${ folder }${ previousImage }` );
    } );
}

Now call cleanupOldFilesAsync from persistCachedFile (before we store the updated file) and voilà : )

The end result is:

import {
    deleteAsync,
    getInfoAsync,
    makeDirectoryAsync,
    readDirectoryAsync,
    copyAsync,
    documentDirectory
} from 'expo-file-system';

export async function ensureDirExists ( directory: string ) {
    const dirInfo = await getInfoAsync( directory );
    if ( !dirInfo.exists ) {
        await makeDirectoryAsync( directory, { intermediates: true } );
    }
}


export async function cleanupOldFilesAsync ( folder: string, fileId: string ) {
    // Find all files that have the fileId in their file name (and delete them):
    const directoryFiles = await readDirectoryAsync( folder );
    const previousImages = directoryFiles.filter( file => file.includes( fileId ) );

    // Delete the previous images.
    previousImages.forEach( previousImage => {
        // We don't await, because removing the files is not critical
        deleteAsync( `${ folder }${ previousImage }` );
    } );
}

export async function persistCachedFile ( cachedFile: string, permanentFolder: string, fileId: string ) {
    const permanentDirectoryPath = `${ documentDirectory }${ permanentFolder }/`;
    const uniqueFilePath = `${ permanentDirectoryPath }${ fileId }-${ Date.now() }`;

    // The directory has to exist before we can read from or copy into it.
    await ensureDirExists( permanentDirectoryPath );

    // Fire-and-forget: remove the previous versions of this file.
    cleanupOldFilesAsync( permanentDirectoryPath, fileId );

    await copyAsync( {
        from: cachedFile,
        to: uniqueFilePath
    } );

    return uniqueFilePath;
}
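
For context, a call site could look something like the sketch below – the ImagePicker usage and the updateContact helper are illustrative placeholders, not the actual app code:

import * as ImagePicker from 'expo-image-picker';

async function updateContactImage ( contactId: string ) {
    // Pick an image from the gallery; in the Expo SDKs of that era the
    // result exposed a `uri` and a `cancelled` flag.
    const result = await ImagePicker.launchImageLibraryAsync();
    if ( result.cancelled ) {
        return;
    }

    // Copy the cached picker file to a permanent, timestamped path.
    const newImageUri = await persistCachedFile( result.uri, 'contact-images', contactId );

    // Saving the new path changes the Image `source` prop everywhere,
    // forcing all the components showing this image to re-render.
    await updateContact( contactId, { imageUri: newImageUri } ); // hypothetical helper
}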

Android Emulator Losing Internet Connectivity

Spoiler: When having Internet connectivity issues in the emulator, make sure that Android Studio is running (it acts as a proxy for the emulator).

So I had a recurring issue where my Android Emulator device would lose connectivity from time to time, and to fix it I would delete it and create a new device from the Android Virtual Device Manager (AVD). Unfortunately, this was only a temporary fix and the issue would appear again at some point.

When it happened again, I decided to debug it instead of starting from scratch. It turned out that in order for the emulator to have an internet connection, Android Studio has to be running. I was losing connectivity because I would sometimes close Android Studio to decrease the load on my PC (I’m working with React Native and Expo, and don’t use Android Studio much).

So before trying something fancy from the SO answers to the “Android emulator not able to access the internet” question, make sure that Android Studio is running 🙂

Wondering how big is your React project?

Recently I wanted to explore the impact of a refactoring I did, so I ended up checking the total line count of the files I had in git (hoping that this number would decrease).

The magic command lists the files tracked by git, greps the ones ending in js, jsx or css, and counts their lines.

git ls-files | grep -P "\.(js|jsx|css)$" | xargs wc -l

This works both on Linux and on Windows with Git Bash.

Side note: Feel free to remove/add/change extensions in the (js|jsx|css) group to check your source files.
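
If you also want to spot the biggest files, pipe the output through a numeric sort (this also works in Git Bash):

git ls-files | grep -P "\.(js|jsx|css)$" | xargs wc -l | sort -n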

Thanks to this Stack Overflow question 🙂

AWS Closes S3 Read Stream Unexpectedly

I’m continuing my notes on transferring big files to and from AWS S3 with Node.js.

If you are reading a file from an S3 bucket using a stream that you occasionally pause, mind that the read stream will be closed after 60 minutes.

If you cannot handle the file within that period, you’d still receive a ‘data’ and an ‘end’ event, even though you haven’t finished processing the file.

One possible solution here is to download the file before starting the import, process it and delete it once we don’t need it any more.

const fs = require( 'fs' );
const path = require( 'path' );

// So instead of:
//   const s3Stream = s3.getObject( params ).createReadStream();
//   const csvStream = fastCsv.fromStream( s3Stream, csvParams );
//   /* Do your processing of the csvStream */

// ...store the file to the file system first:
const s3Stream = s3.getObject( params ).createReadStream();
const localFileWriteStream = fs.createWriteStream( path.resolve( 'tmp', 'big.csv' ) );
s3Stream.pipe( localFileWriteStream );

localFileWriteStream.on( 'close', () => {
    const localReadStream = fs.createReadStream( path.resolve( 'tmp', 'big.csv' ) );

    const csvStream = fastCsv.fromStream( localReadStream, csvParams );

    csvStream.on( 'data', ( data ) => {
        /* Do your processing of the csvStream */
    } );

    csvStream.on( 'end', () => {
        // Delete the tmp file (fs.unlink requires a callback).
        fs.unlink( path.resolve( 'tmp', 'big.csv' ), () => {} );
    } );
} );

Node.js Streams and why sometimes they don’t pause()

TL;DR:

If you pipe your node.js streams, make sure you pause the last one in the chain.

const stream1 = s3.getObject( params ).createReadStream();
const stream2 = fastCsv.fromStream( stream1 ); // This pipes stream1 into stream2 behind the scenes.
// If you want to pause the streams, pause the last one in the chain:
stream2.pause();

The longer story:

We’re building a Node.js application that ingests data from multiple data sources for a client of ours. Since the client is quite big in size and in user base (we’re going to process data for ~50M users from tens of systems), the ingested files (CSV) are also relatively big, several GB each.


We’re using AWS S3 as the glue for the data – the systems upload their data there and we monitor for new files to ingest. We’re using the aws-sdk node package to read them as streams, parse them using fastCsv, and create an audit log and snapshots for each of the users in a PG database.

We are batching the inserts and are pausing the data stream for each of the batches, so we don’t end up with a back-pressure problem.

While testing the ingestion of the big files, we noticed something peculiar. We thought we had paused the stream, but it continued to push data as if .pause() had never been invoked.

The mistake we made turned out to be quite common when you work with streams – we called the .pause() method of the s3Stream, which we had pipe()-d into another stream – the fastCsv one. In this scenario, whenever the fastCsv stream drained, it called the resume() method of the s3Stream.

In order to pause the streams, one must pause() the last piped one (in our case the fastCsv one).
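
As a rough sketch, the batching then looks something like this (the batch size and the insertBatch helper are illustrative, not our production code):

const batch = [];

const s3Stream = s3.getObject( params ).createReadStream();
const csvStream = fastCsv.fromStream( s3Stream, csvParams );

csvStream.on( 'data', async ( row ) => {
    batch.push( row );

    if ( batch.length >= 1000 ) {
        // Pause the LAST stream in the chain – pausing s3Stream directly
        // would get undone by pipe() on the next drain.
        csvStream.pause();
        await insertBatch( batch.splice( 0 ) ); // hypothetical PG insert helper
        csvStream.resume();
    }
} );

csvStream.on( 'end', async () => {
    // Flush the final, partial batch.
    if ( batch.length ) {
        await insertBatch( batch.splice( 0 ) );
    }
} );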

More on back-pressure: while researching our issue, I found a very extensive article about back-pressure in Node.js.