Recording Audio In a NativeScript App

Okay, so not everyone will need to record audio from a device in their mobile application, but it’s still a pretty kick-ass feature to add to your app. This tutorial will be short and sweet. We are going to use the nativescript-audio plugin. You can find the repo here. A pal of mine, Nathan Walker, contributed the iOS version and did a lot of the TypeScript cleanup on the code base, so thanks to him for his contribution. This plugin actually records and plays back audio; I plan on doing a follow-up on playing audio later on.

So let’s get started with the code. We are going to create a fresh NativeScript app, add the plugin and record some audio. If you aren’t familiar with NativeScript, there is an excellent getting started guide. We will stick to Android for this tutorial but the process will be the same for iOS since the plugin handles all of the native work for you.

Important Android Reminder:

Recording audio requires permission. On devices running Android API <= 22, you just need to add the permission to the AndroidManifest.xml. The permission is:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
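For context, here is roughly where that line lives in app/App_Resources/Android/AndroidManifest.xml. This is a trimmed sketch; your real manifest will have more attributes and entries, but the `uses-permission` element goes directly under `<manifest>`, outside `<application>`:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">

    <!-- Required to use the device microphone -->
    <uses-permission android:name="android.permission.RECORD_AUDIO" />

    <application>
        <!-- activities, services, etc. -->
    </application>
</manifest>
```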

If you’re targeting API 22 or lower, then you are done. However, as of now you should be targeting API 23, which handles certain permissions at run time instead of at install time. For this we are going to use Nathanael Anderson’s plugin, nativescript-permissions.

Open your command prompt on Windows or the terminal on Mac/Linux. Then run the following commands:

tns create AudioRecorder

cd AudioRecorder

tns platform add android

Next we need to add our plugins. Execute the following commands:

tns plugin add nativescript-audio
tns plugin add nativescript-permissions
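Either way, the plugins should end up in your project’s package.json dependencies — something like the fragment below (the version numbers here are illustrative only; yours will differ):

```json
{
  "dependencies": {
    "nativescript-audio": "^1.0.0",
    "nativescript-permissions": "^1.0.0"
  }
}
```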

Now we have a new NativeScript app with the Android platform and our plugins installed. For the UI we will just add three buttons — to start recording, stop recording, and see the audio file path — then wire up the functions to execute the correct methods on the plugin. Very simple stuff (that’s the purpose of this tutorial).

Open up your IDE or text editor and let’s change app/main-page.xml to look like the following:

<Page xmlns="http://schemas.nativescript.org/tns.xsd" navigatingTo="onNavigatingTo">
  <StackLayout>
    <ActivityIndicator color="#3489db" busy="{{ isRecording }}" />
    <Button text="Start Recording" tap="start" />
    <Button text="Stop Recording" tap="stop" />
    <Button text="Get Recorded File" tap="getFile" />
    <Label text="{{ recordedAudioFile }}" color="#3489db" textWrap="true" />
  </StackLayout>
</Page>



Next, we need the code for the start and stop functions. Open up the app/main-page.js file and change it to the following:

var observable = require("data/observable");
var fs = require("file-system");
var audio = require("nativescript-audio");
var permissions = require("nativescript-permissions");

var data = new observable.Observable({});
var recorder;

function onNavigatingTo(args) {
    var page = args.object;
    page.bindingContext = data;

    data.set("isRecording", false);
}
exports.onNavigatingTo = onNavigatingTo;

function start(args) {
    permissions.requestPermission(android.Manifest.permission.RECORD_AUDIO, "Let me hear your thoughts...")
        .then(function () {
            // you should check if the device has recording capabilities
            if (audio.TNSRecorder.CAN_RECORD()) {
                recorder = new audio.TNSRecorder();

                // store the recording in an "audio" folder inside the app folder
                var audioFolder = fs.knownFolders.currentApp().getFolder("audio");

                var recorderOptions = {
                    filename: audioFolder.path + "/recording.mp3",
                    infoCallback: function () {
                        console.log("infoCallback");
                    },
                    errorCallback: function () {
                        alert("Error recording.");
                    }
                };

                console.log("RECORDER OPTIONS: " + JSON.stringify(recorderOptions));

                recorder.start(recorderOptions).then(function (res) {
                    data.set("isRecording", true);
                }, function (err) {
                    data.set("isRecording", false);
                    console.log("ERROR: " + err);
                });
            } else {
                alert("This device cannot record audio.");
            }
        })
        .catch(function () {
            console.log("Uh oh, no permissions - plan B time!");
        });
}
exports.start = start;

function stop(args) {
    if (recorder !== undefined) {
        recorder.stop().then(function () {
            data.set("isRecording", false);
            alert("Audio Recorded Successfully.");
        }, function (err) {
            data.set("isRecording", false);
        });
    }
}
exports.stop = stop;

function getFile(args) {
    try {
        var audioFolder = fs.knownFolders.currentApp().getFolder("audio");
        var recordedFile = audioFolder.getFile("recording.mp3");
        data.set("recordedAudioFile", recordedFile.path);
    } catch (ex) {
        console.log("ERROR: " + ex);
    }
}
exports.getFile = getFile;

That’s it. You are now ready to build the app and run it on your device or emulator. One last warning: some emulators limit device capabilities, so I suggest Genymotion over the stock Android emulator. Run the following NativeScript commands to view your work:

tns build android

tns run android

Once the app is built and you run it, your emulator or attached device will load up the app. Press the Start Recording button and you should see an ActivityIndicator appear (if on API 23, you’ll need to grant the record audio permission) — you are now recording audio. Stop recording when you are done.


You’ve now created a native audio file in your NativeScript app. In the next tutorial, I’ll go over how to get the file and play it back using the same nativescript-audio plugin, as well as playing remote and other local audio files.

18 thoughts on “Recording Audio In a NativeScript App”

    1. Have to look into it. Are you handling the permissions correctly? On Android 6.0+ you’ve got to do runtime permissions, so it might fail there. If that’s not it, I can check later next week.


  1. Brad, sorry to bug you, and again thanks for the work you’ve done here. I was easily able to incorporate your component to achieve what I needed in the Android environment. But I am now trying to get things set up so I can record in iOS to a format that will render directly in a browser once sent to a server, without any conversions on the backend. In Android I’m using .wav files, so there is no issue. So I was wondering if you could point me in the right direction with regards to format/audio options etc. in order to achieve this.


    1. Forgot to circle back to this. Did you ever figure anything out? If not, you should join the new NativeScript forum and ask. There are some brilliant people there — ping me on there and we can help sort through this if you still need help.


  2. Brad,
    Nice example, but I am getting an error on recorder.start(recorderOptions). Thanks in advance.
    java.lang.RuntimeException: start failed.
    JS: Method)


  3. Got an exception :

    An uncaught Exception occurred on “main” thread.
    java.lang.RuntimeException: Unable to start activity ComponentInfo{org.nativescript.AudioRecorder/com.tns.NativeScriptActivity}: com.tns.NativeScriptException: Failed to find module: “nativescript-audio”, relative to: app/tns_modules/
    at android.os.Handler.dispatchMessage(
    at android.os.Looper.loop(
    at java.lang.reflect.Method.invoke(Native Method)
    Caused by: com.tns.NativeScriptException: Failed to find module: “nativescript-audio”, relative to: app/tns_modules/
    at com.tns.Module.resolvePathHelper(
    at com.tns.Module.resolvePath(
    at com.tns.Runtime.callJSMethodNative(Native Method)
    at com.tns.Runtime.dispatchCallJSMethodNative(
    at com.tns.Runtime.callJSMethodImpl(
    at com.tns.Runtime.callJSMethod(
    at com.tns.Runtime.callJSMethod(
    at com.tns.Runtime.callJSMethod(
    at com.tns.NativeScriptActivity.onCreate(
    … 9 more


      1. Hi and thank you for your reply. When I added audio plugin it had a warning asking for a peer of a lower version of tns-core-modules. I removed the original 3.2.0 and installed 3.0.0 but same warning was there.
        What’s the link to your other post?


  4. Yea, you’re right, it should. I see why it’s not. Don’t use npm install with NativeScript — while it should work in most cases, the correct command is `tns plugin add` because of the internals of NS, which still uses npm under the hood.


      1. It would be `"nativescript-audio": "^4.0.3"` — you might have a node, npm, or TNS issue if that’s not modifying your project’s package.json, though. Very odd.


    1. I created a new project with Sidekick, added the plugins and manually added them to package.json, and it worked (tested on emulator). I don’t know what was happening, but after all these steps the Android build works. Maybe it must be done this way.


      1. Definitely not the only way 🙂 you have an issue on your setup or machine that caused installation issues. I have used the plugin many times from the CLI with no issues, so I’m certain something was wrong with your setup. Do any other plugins work when you add them via the CLI to that project where audio didn’t work? If so, it might help isolate the issue for your setup.


  5. hey brad

    it’s not working on iOS; I always get the same error from the TNSRecorder.start() method:

    CONSOLE LOG file:///app/tns_modules/nativescript-audio/src/ios/recorder.js:26:32: setCategoryError: null

    any help would be appreciated



  6. Unfortunately I am facing this every time:
    JS: Uh oh, no permissions – plan B time!
    I tried adding the permission to the AndroidManifest.xml manually, and also using nativescript-permissions.
    Help required.

