5 steps for building an audio player in Flutter


Dear Flutter fan 👋

In this article I’ll show you the process of building a simple audio player application in Flutter. I will be referring to some Flutter concepts (Providers, BLoC) as well as software engineering patterns like the Repository Pattern. If you’re not familiar with these concepts, take a moment to read up on them before going further in this article. They’ll come in really handy.

The key elements covered in this article are:

  • playing audio files from both offline and online sources
  • fitting the audio player into a scalable and maintainable app architecture
  • seamless support for listening in both the foreground and the background

I wrote this article to help you streamline the application creation process. I also included clean architecture, user experience, and risk management tips.

Sounds like a familiar challenge to tackle? Good. If you’re not willing to spend time on deeper research, you can follow these 5 practical steps and apply this process in your application.

1. Select the right plugin that meets your app’s requirements

Flutter has been a fast-growing framework in recent months. That means whenever you need to handle an advanced functionality, it’s good practice to start by researching the available plugins.

Even when I implement a functionality I have built myself in the past, I do this kind of research once more when I start a new project.

There are a couple of reasons for that:

  • There might be a newer plugin that has better performance
  • Existing tools may be improved with enhanced capabilities, or documentation that lowers the barriers to entry
  • On the other hand, existing tools may have been deprecated, or are not being actively maintained

When I did the audio player plugin research, I set clear requirements for the tool I was looking for:

  • Support for both Android and iOS (Web was not a requirement in my case)
  • Playing tracks from both offline and online media sources
  • Seamless support for the listening experience in both foreground and background

Then I followed my de-risking process:

  1. I shortlisted the available plugins found at https://pub.dev/ for the “audio player” keyword.
  2. I checked each plugin’s Installing and Example sections. I needed to understand how quickly I would be able to achieve “my first win”.
  3. I googled for Flutter community articles, or mentions, about the topic and selected plugins. It’s an additional validation of the potential solution’s maturity. It’s a plus if there is a comparable experience of using it in already released applications.
  4. I asked the Flutter community members for their experience with the audio playing functionality.

It may sound like a long and complex activity. In fact, it may take you 2 – 4 hours to prepare a thoroughly-researched shortlist of 2 – 3 plugins. That increases the probability that one of them will correctly support your functionality. Additionally, skipping the research can cost you countless hours – you may realize the tool does not fit your needs and you have to start from scratch.

The next step was to create a tiny application using the plugin, based on the “Get Started” code snippets. We can call it a Proof of Concept (PoC) of our application.

When I was going through the plugin selection process for the first time, I selected the Assets Audio Player plugin, created by Florent Champigny. I managed to build a PoC application within one hour that met all my functionality requirements.

However, I strongly recommend performing your own plugin selection process, as the plugin landscape might have evolved by the time you’re reading this.
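To give you a feel for what such a PoC can look like, here is a minimal sketch built around the plugin’s play/pause API. The asset path is a placeholder, and the snippet assumes the assets_audio_player package and the asset are declared in pubspec.yaml:

```dart
import 'package:assets_audio_player/assets_audio_player.dart';
import 'package:flutter/material.dart';

void main() => runApp(PocApp());

// A bare-bones PoC: a single button that opens a bundled asset on the
// first tap and toggles play/pause afterwards.
class PocApp extends StatelessWidget {
  final AssetsAudioPlayer assetsAudioPlayer = AssetsAudioPlayer.newPlayer();

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: Center(
          child: RaisedButton(
            child: Text("Play / Pause"),
            onPressed: () async {
              if (assetsAudioPlayer.current.value == null) {
                // "assets/audios/sample.mp3" is a placeholder path.
                await assetsAudioPlayer.open(
                  Audio("assets/audios/sample.mp3"),
                );
              } else {
                await assetsAudioPlayer.playOrPause();
              }
            },
          ),
        ),
      ),
    );
  }
}
```

If a sketch like this plays sound on a device within an hour, the plugin has passed its most important test.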

2. Build a maintainable architecture for your solution

Taking care of clean application architecture from Day 1 allows you to keep the codebase maintainable and to iterate fast. Hence, you need to understand what elements your structure would need for the audio playing feature, even for the MVP of your product.

I identified the following requirements for the audio player feature:

  • The app can play audio assets stored locally
  • The app can integrate online audio sources with ease
  • The app can easily handle audio playback in both foreground and background

The goal is to integrate the selected plugin into your architecture concept. I decided to build an architecture with the help of the BLoC pattern and Provider library. ResoCoder created great tutorials about both topics.

1. Define our audio model – AudioPlayerModel

AudioPlayerModel keeps all the necessary information, including Audio data that AssetsAudioPlayer uses for playing audio. We would also like to enhance our model with value equality capabilities, thanks to Equatable. You can find more information in Felix Angelov’s article about it.

class AudioPlayerModel extends Equatable {

  final String id;
  final Audio audio;
  final bool isPlaying;

  const AudioPlayerModel({this.id, this.audio, this.isPlaying});

  @override
  List<Object> get props => [this.id, this.isPlaying];

  @override
  bool get stringify => true;
}

2. Create AudioPlayerRepository abstraction

Our AudioPlayerRepository lets us get either one or all the available audio player models. I used repository abstraction to achieve more flexibility in audio player model sourcing.

abstract class AudioPlayerRepository {
  Future<AudioPlayerModel> getById(String audioPlayerId);
  Future<List<AudioPlayerModel>> getAll();
}

As a result we can provide both offline and online audio sources. In my example I considered only offline audio storage, kept in memory.

class InMemoryAudioPlayerRepository implements AudioPlayerRepository {

  final List<AudioPlayerModel> audioPlayerModels;

  InMemoryAudioPlayerRepository({this.audioPlayerModels});

  @override
  Future<AudioPlayerModel> getById(String audioPlayerId) async {
    return Future.value(
        audioPlayerModels.firstWhere((model) => model.id == audioPlayerId));
  }

  @override
  Future<List<AudioPlayerModel>> getAll() async {
    return Future.value(audioPlayerModels);
  }
}

Here is an example of providing the AudioPlayerModel list to the repository:

class AudioPlayerModelFactory {

  static List<AudioPlayerModel> getAudioPlayerModels() {
    return [
      AudioPlayerModel(
        id: "1",
        isPlaying: false,
        audio: Audio(
          "assets/audios/my_country_song.mp3",
          metas: Metas(
            id: "1",
            title: "My Country Song",
            artist: "Joe Doe",
            album: "Country Album",
            image: MetasImage.asset("assets/images/country_image.png"),
          ),
        ),
      ),
    ];
  }
}

Having the repository abstraction defined, we can easily replace the offline audio source with e.g. an SQL database.
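For instance, an online-backed implementation only has to satisfy the same interface. Below is a hedged sketch of such a repository – the URL is a made-up placeholder, and Audio.network is the plugin’s constructor for remote sources:

```dart
// A hypothetical repository backed by an online catalog. The URL and
// metadata below are placeholders, not a real service.
class RemoteAudioPlayerRepository implements AudioPlayerRepository {

  @override
  Future<List<AudioPlayerModel>> getAll() async {
    // In a real implementation, this list would come from an API call
    // or a database query instead of being hard-coded.
    return [
      AudioPlayerModel(
        id: "1",
        isPlaying: false,
        audio: Audio.network("https://example.com/tracks/1.mp3"),
      ),
    ];
  }

  @override
  Future<AudioPlayerModel> getById(String audioPlayerId) async {
    final models = await getAll();
    return models.firstWhere((model) => model.id == audioPlayerId);
  }
}
```

Because the rest of the app only depends on AudioPlayerRepository, swapping the in-memory source for this one requires no changes elsewhere.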

3. Create an AudioPlayerBloc class that handles all player and repository operations

class AudioPlayerBloc extends Bloc<AudioPlayerEvent, AudioPlayerState> {

  final AssetsAudioPlayer assetsAudioPlayer;
  final AudioPlayerRepository audioPlayerRepository;

  final List<StreamSubscription> playerSubscriptions = [];

  AudioPlayerBloc({this.assetsAudioPlayer, this.audioPlayerRepository}) {
    playerSubscriptions.add(
        assetsAudioPlayer.playerState.listen((event) {
          _mapPlayerStateToEvent(event);
        }));
  }

  @override
  Stream<AudioPlayerState> mapEventToState(AudioPlayerEvent event) async* {
    if (event is AudioPlayed) {
      yield* _mapAudioPlayedToState(event);
    }
    if (event is AudioPaused) {
      yield* _mapAudioPausedToState(event);
    }
    if (event is AudioStopped) {
      yield* _mapAudioStoppedToState();
    }
  }

  ...

  @override
  Future<void> close() async {
    playerSubscriptions.forEach((subscription) {
      subscription.cancel();
    });
    await assetsAudioPlayer.dispose();
    return super.close();
  }

  ...
}

We provide both an AssetsAudioPlayer and an AudioPlayerRepository in our BLoC constructor. The former can be created with the AssetsAudioPlayer.newPlayer() method. Now let’s consider how the repository gets there.
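The event and state classes referenced by the BLoC are not shown in full in this article. One possible shape, matching the handlers above, is sketched below; TriggeredPlayAudio and TriggeredPauseAudio, dispatched later from the UI, would follow the same pattern:

```dart
// Events describe what happened; states describe what the UI should show.
// This is one possible shape for the classes used by AudioPlayerBloc.
abstract class AudioPlayerEvent extends Equatable {
  @override
  List<Object> get props => [];
}

class AudioPlayed extends AudioPlayerEvent {
  final String audioPlayerId;
  AudioPlayed(this.audioPlayerId);

  @override
  List<Object> get props => [audioPlayerId];
}

class AudioPaused extends AudioPlayerEvent {
  final String audioPlayerId;
  AudioPaused(this.audioPlayerId);

  @override
  List<Object> get props => [audioPlayerId];
}

class AudioStopped extends AudioPlayerEvent {}

abstract class AudioPlayerState extends Equatable {
  @override
  List<Object> get props => [];
}

class AudioPlayerReady extends AudioPlayerState {}

class AudioPlayerPlaying extends AudioPlayerState {
  final AudioPlayerModel playingEntity;
  AudioPlayerPlaying(this.playingEntity);

  @override
  List<Object> get props => [playingEntity];
}

class AudioPlayerPaused extends AudioPlayerState {
  final AudioPlayerModel pausedEntity;
  AudioPlayerPaused(this.pausedEntity);

  @override
  List<Object> get props => [pausedEntity];
}
```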

4. Provide the repository and BLoC with Providers

To achieve AudioPlayerBloc usage flexibility and availability within different screens, I aimed for making it available within the application scope. I made use of MultiRepositoryProvider and MultiBlocProvider inside the App Widget building. It’s also a handy solution for applications with several features, as other BLoCs and repositories may be applied here.

class App extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MultiRepositoryProvider(
      providers: [
        RepositoryProvider<AudioPlayerRepository>(
          create: (context) => InMemoryAudioPlayerRepository(
              audioPlayerModels: AudioPlayerModelFactory.getAudioPlayerModels()),
        ),
        ...
        // You can provide other repositories as well
      ],
      child: MultiBlocProvider(
        providers: [
          BlocProvider<AudioPlayerBloc>(
            create: (BuildContext context) => AudioPlayerBloc(
                assetsAudioPlayer: AssetsAudioPlayer.newPlayer(),
                audioPlayerRepository:
                    RepositoryProvider.of<AudioPlayerRepository>(context)),
          ),
          ...
          // You can provide other bloc providers as well
        ],
        child: MaterialApp(
            title: "App title",
            home: MainScreen()),
      ),
    );
  }
}

 

3. Provide a listening preview in the background

In the first two steps we took care of Flutter tooling and proper code architecture. Right now we’ll focus on the user experience part.

If you’re a Flutter developer, I guess you’re also an active mobile apps user. You probably get that feeling of how an audio player should provide you with a pleasant listening experience, right?

Our user’s satisfaction is a key success factor. Let’s focus on three fundamental aspects of a seamless experience.

  1. Let the user control the audio from the notification bar

As mobile users, we tend to interact with several applications within a couple of seconds. As soon as we trigger the “Play” button in our favorite music app, it usually goes into the background immediately. However, we expect to have a handy listening preview in the notification bar.

From a codebase standpoint, we would like to show or update the notification with each audio play action. With AssetsAudioPlayer it’s defined as a showNotification parameter when opening an audio track.

await assetsAudioPlayer.open(
    audioPlayerModel.audio,
    showNotification: true
);

  2. Let the user manage their listening experience in the background

As a user, I would like to either perform player actions in the notification bar or in the player screen inside the app. Hence, we need to handle both scenarios in our AudioPlayerBloc to adjust audio models state.

class AudioPlayerBloc extends Bloc<AudioPlayerEvent, AudioPlayerState> {

  final AssetsAudioPlayer assetsAudioPlayer;
  final AudioPlayerRepository audioPlayerRepository;

  final List<StreamSubscription> playerSubscriptions = [];

  AudioPlayerBloc({this.assetsAudioPlayer, this.audioPlayerRepository}) {
    playerSubscriptions.add(
        assetsAudioPlayer.playerState.listen((event) {
          _mapPlayerStateToEvent(event);
        }));
  }

  ...

  void _mapPlayerStateToEvent(PlayerState playerState) {
    if (playerState == PlayerState.stop) {
      add(AudioStopped());
    } else if (playerState == PlayerState.pause) {
      add(AudioPaused(assetsAudioPlayer.current.value.audio.audio.metas.id));
    } else if (playerState == PlayerState.play) {
      add(AudioPlayed(assetsAudioPlayer.current.value.audio.audio.metas.id));
    }
  }

  ...
}

When using AssetsAudioPlayer, we can subscribe to it to fetch information about its state and relevant audio model identifier. Then, we map the received player state to an appropriate BLoC event that updates the data and refreshes the layout.
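The event handlers elided from the BLoC above complete that loop. As one example, a sketch of `_mapAudioPlayedToState` might look as follows – the exact bookkeeping depends on how you model your states:

```dart
// Inside AudioPlayerBloc. A sketch of the AudioPlayed handler: look up
// the model, mark it as playing, and emit a playing state for the UI.
Stream<AudioPlayerState> _mapAudioPlayedToState(AudioPlayed event) async* {
  final model = await audioPlayerRepository.getById(event.audioPlayerId);
  final playingModel = AudioPlayerModel(
    id: model.id,
    audio: model.audio,
    isPlaying: true,
  );
  yield AudioPlayerPlaying(playingModel);
}
```

Whether the play action originated in the notification bar or in the app, the same handler runs, so both surfaces stay in sync.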

  3. Handle all potential corner cases that might break the listening experience

It’s also good practice to consider all potential corner cases:

  • What would happen if somebody called our user?
  • What would happen if wireless headphones disconnected while playing?
  • Should the audio player respect the silent mode?

In AssetsAudioPlayer we can define the following scenarios when opening the audio track:

await assetsAudioPlayer.open(
    updatedModel.audio,
    showNotification: true,
    respectSilentMode: true,
    phoneCallStrategy: PhoneCallStrategy.pauseOnPhoneCallResumeAfter,
    headPhoneStrategy: HeadPhoneStrategy.pauseOnUnplug
);

 

4. Let the user take control of listening when in the foreground

Letting the user handle the listening experience in the background is one thing. Another is to provide a consistent user experience with listening in the foreground.

As we can have multiple pages available, we would like to provide control over listening on top of each page. Hence, we can compose a Widget attached above the bottom bar.

This pattern looks similar to current music player experiences (e.g. Spotify). We don’t want to teach users completely new patterns; instead, we let them rely on the intuition they have already developed.

Our Widget should also fetch the data from our global AudioPlayerBloc – BlocBuilder helps us achieve that. We can build a PlayerWidget with our BLoC and a ListTile-based UI:

class PlayerWidget extends StatelessWidget {
  const PlayerWidget({Key key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return BlocBuilder<AudioPlayerBloc, AudioPlayerState>(
      builder: (context, state) {
        if (state is AudioPlayerReady) {
          return SizedBox.shrink();
        }
        if (state is AudioPlayerPlaying) {
          return _showPlayer(context, state.playingEntity);
        }
        if (state is AudioPlayerPaused) {
          return _showPlayer(context, state.pausedEntity);
        } else {
          return SizedBox.shrink();
        }
      },
    );
  }

  Widget _showPlayer(BuildContext context, AudioPlayerModel model) {
    return Container(
      color: Colors.grey.shade800,
      child: ListTile(
        leading: setLeading(model),
        title: setTitle(model),
        subtitle: setSubtitle(model),
        contentPadding: EdgeInsets.symmetric(vertical: 4, horizontal: 16),
        trailing: IconButton(
          icon: setIcon(model),
          onPressed: setCallback(context, model),
        ),
      ),
    );
  }

  Widget setIcon(AudioPlayerModel model) {
    if (model.isPlaying)
      return Icon(Icons.pause);
    else
      return Icon(Icons.play_arrow);
  }

  Widget setLeading(AudioPlayerModel model) {
    return Image.asset(model.audio.metas.image.path);
  }

  Widget setTitle(AudioPlayerModel model) {
    return Text(model.audio.metas.title);
  }

  Widget setSubtitle(AudioPlayerModel model) {
    return Text(model.audio.metas.artist);
  }

  Function setCallback(BuildContext context, AudioPlayerModel model) {
    if (model.isPlaying)
      return () {
        BlocProvider.of<AudioPlayerBloc>(context)
            .add(TriggeredPauseAudio(model));
      };
    else
      return () {
        BlocProvider.of<AudioPlayerBloc>(context)
            .add(TriggeredPlayAudio(model));
      };
  }
}

Once we have the PlayerWidget, we can add it to each page in our application. We can stack our audio player widget with others using the Stack widget.

Widget buildPageWithPlayer() {
  return Stack(
    fit: StackFit.expand,
    alignment: Alignment.topCenter,
    children: <Widget>[
      ...
      // Other Widgets posted inside,
      Container(
        alignment: Alignment.bottomCenter,
        child: PlayerWidget(),
      )
    ],
  );
}

 

5. Apply platform-specific changes

We covered everything from the perspective of Dart code. The last, but by no means least important, element is to meet platform-specific requirements, namely network and background mode policies. Without these changes our application would not work as expected.

Android

If your music player has an online source, you will need to add the android.permission.INTERNET permission to your AndroidManifest.xml file. By default, Android only allows secured HTTPS requests. For plain HTTP calls, you need to add an explicit android:usesCleartextTraffic="true" attribute to your manifest.

<?xml version="1.0" encoding="utf-8"?>
<manifest ...>
    <uses-permission android:name="android.permission.INTERNET" />
    <application

        <!-- If you fetch audio from HTTP requests -->
        android:usesCleartextTraffic="true"
        ...>
        ...
    </application>
</manifest>

iOS

For online-sourced music players on iOS, we need to modify the Info.plist file with both network and background mode policies.

Assuming the same HTTPS vs. HTTP use case as on Android above, we need to define NSAppTransportSecurity with NSAllowsArbitraryLoads set to true.

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>

To enable the app to make outgoing network connections, we define com.apple.security.network.client set to true.

<key>com.apple.security.network.client</key>
<true/>

Finally, for proper background mode policies, we should define UIBackgroundModes with audio and fetch values. They allow, respectively, to play audio and fetch data in the background.

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
    <string>fetch</string>
</array>

 

Summary

Hey, you’ve made it! By now you know how to create your own music player in Flutter, based on a plugin-independent process.

We only touched the basics in this article. The topic and the codebase itself can be improved or expanded, depending on your needs!

You can find the example codebase on GitHub.

I would like to thank Dominik Roszkowski, Florent Champigny, Paweł Dyląg, Rafał Ociepa and ResoCoder for their contribution to this blogpost!

I’m really grateful for your attention! I’m a big fan of giving and getting constructive feedback, so if you would like to share your thoughts, feel free to contact me via Twitter!

Flutter fan,
Arek

PS. If you would like to be up to date with my latest posts, join my newsletter below!

Join the newsletter

Arek Biela

Senior Android Engineer at Revolut and Flutter Cracow Founder.

Building software products and dev communities. Following FinTech, Digital Health & AR trends.