陈斌彬的技术博客

Stay foolish, stay hungry

iOS - Playing Local Audio with AVAudioPlayer

Introduction

The AV Foundation (Audio and Video Foundation) framework in the iOS SDK allows developers to play and/or record audio and video with ease. In addition, the Media Player framework allows developers to play audio and video files. Before you can run the code in this chapter, you must add the AVFoundation.framework and MediaPlayer.framework frameworks to your Xcode project. With the new LLVM compiler, all you have to do to include these frameworks in your app is import their umbrella header files, like so:

#import "AppDelegate.h"
#import <AVFoundation/AVFoundation.h>

#import <MediaPlayer/MediaPlayer.h>
@implementation AppDelegate
<# Rest of your app delegate code goes here #>

Playing Audio Files

Solution

Use the AV Foundation framework’s AVAudioPlayer class.

Discussion

The AVAudioPlayer class in the AV Foundation framework can play back all audio formats supported by iOS. The delegate property of an instance of AVAudioPlayer allows you to get notified by events, such as when the audio playback is interrupted or when an error occurs as a result of playing an audio file. Let’s have a look at a simple example that demonstrates how we can play an audio file from the application’s bundle:

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                       successfully:(BOOL)flag{

    NSLog(@"Finished playing the song");

    /* The flag parameter tells us whether the playback finished
       successfully or not */

    if ([player isEqual:self.audioPlayer]){
        self.audioPlayer = nil;
    } else {
        /* Which audio player is this? We certainly didn't allocate
           this instance! */
    }

}
- (void)viewDidLoad {
    [super viewDidLoad];

    dispatch_queue_t dispatchQueue =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    dispatch_async(dispatchQueue, ^(void) {

        NSBundle *mainBundle = [NSBundle mainBundle];
        NSString *filePath = [mainBundle pathForResource:@"MySong"
                                                  ofType:@"mp3"];

        NSData *fileData = [NSData dataWithContentsOfFile:filePath];
        NSError *error = nil;

        /* Start the audio player */
        self.audioPlayer = [[AVAudioPlayer alloc] initWithData:fileData
                                                         error:&error];

        /* Did we get an instance of AVAudioPlayer? */
        if (self.audioPlayer != nil){
            /* Set the delegate and start playing */
            self.audioPlayer.delegate = self;
            if ([self.audioPlayer prepareToPlay] &&
                [self.audioPlayer play]){
                /* Successfully started playing */
            } else {
                /* Failed to play */
            }
        } else {
            /* Failed to instantiate AVAudioPlayer */
        }

    });
}

As you can see, the file’s data is loaded into an instance of NSData and then passed on to AVAudioPlayer ’s initWithData:error: method. Because we need the actual, absolute path of the MP3 file to extract the data from that file, we invoke the mainBundle class method of NSBundle to retrieve the information from the application’s configuration. The pathForResource:ofType: instance method of NSBundle can then be used to retrieve the absolute path to a resource of a specific type, as demonstrated in the example code.
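As a side note, if you do not need the NSData instance yourself, AVAudioPlayer can also read the file directly through its initWithContentsOfURL:error: initializer, paired with NSBundle's URLForResource:withExtension: method. A minimal sketch, assuming the same MySong.mp3 resource as in the example above:

NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"MySong"
                                         withExtension:@"mp3"];
NSError *error = nil;
/* Let the player read the file itself instead of loading NSData first */
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL
                                                          error:&error];
if (self.audioPlayer == nil){
    NSLog(@"Failed to create the audio player: %@", error);
}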

The audioPlayerDidFinishPlaying:successfully: method will get called on the delegate object of the audio player whenever, as the method's name indicates, the player finishes playing the audio file. Note that this does not necessarily mean the whole audio file was played to the end. There could have been an interruption; for instance, the audio channel may have been taken over by another app coming to the foreground, causing your app to stop playing. In that case, too, this method gets called. Either way, it is a great place to release your audio player if you no longer need it.
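If your app should react to interruptions in more detail, the AVAudioPlayerDelegate protocol in iOS SDKs of this era also declares the audioPlayerBeginInterruption: and audioPlayerEndInterruption:withOptions: methods. A hedged sketch of how these might be handled:

- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player{
    /* The system has paused our player, e.g. because of an incoming call */
    NSLog(@"Playback was interrupted");
}

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player
                       withOptions:(NSUInteger)flags{
    /* Resume only if the system indicates that we should */
    if (flags & AVAudioSessionInterruptionOptionShouldResume){
        [player play];
    }
}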

In the viewDidLoad method, we are using GCD to asynchronously load the song’s data into an instance of NSData and use that as a feed to the audio player. We do this because loading the data of an audio file can take a long time (depending on the length of the audio file), and if we do this on the main thread, we run the risk of stalling the UI experience. Because of this, we are using a global concurrent queue to ensure that the code does not run on the main thread.
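One consequence of this approach is that any UI work triggered from inside the block must be dispatched back to the main queue, because the block itself runs on a background thread. A minimal sketch (the playButton outlet here is hypothetical and not part of the original example):

/* Inside the dispatch_async block, after playback has started */
dispatch_async(dispatch_get_main_queue(), ^{
    /* playButton is a hypothetical outlet; UI updates belong on the main queue */
    self.playButton.enabled = NO;
});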

Since we are assigning the instance of AVAudioPlayer to a property named audioPlayer, we must also see how this property is defined:

#import "ViewController.h"

#import <AVFoundation/AVFoundation.h>

@interface ViewController () <AVAudioPlayerDelegate> @property (nonatomic, strong) AVAudioPlayer *audioPlayer; @end

@implementation ViewController

As you can see, we have made the view controller the delegate of the audio player. This way, we can receive messages from the system whenever the audio player, for instance, is interrupted or has finished playing the song. With this information in hand, we can make appropriate decisions in the application, such as starting to play another audio file.
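Finally, since the view controller owns the audio player through a strong property, you may also want to stop playback when the view controller goes away. A minimal sketch, assuming the audio should be tied to the view's lifetime:

- (void)viewWillDisappear:(BOOL)animated{
    [super viewWillDisappear:animated];
    /* Assumption: the audio should not outlive this view controller */
    [self.audioPlayer stop];
    self.audioPlayer = nil;
}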
