Xinrong Guo on June 4, 2013

Learn how to create your own music visualizer!

In the mid-seventies, Atari released the Atari Video Music, a device that connected a stereo to a television and produced abstract images in sync with the music. Consumers could manipulate the images by twisting knobs and pushing buttons on the device.

The device was a market failure but it was the first time that the world was exposed to music visualization. Now, music visualization is a common technology that can be found in almost every digital media player such as iTunes or Windows Media Player.

To see an example of music visualization in action, simply launch iTunes, start a good tune, then choose View/Show Visualizer and allow the psychedelics to free your mind! :]

In this tutorial, you’ll create your very own music visualizer. You’ll learn how to configure the project to play music, support background audio, and create particle effects using UIKit’s particle system. You’ll also learn how to make those particles dance to the beat of a song.

So cue up the music and break out the disco ball, things are about to get visual!

Note: You can try out most of the tutorial using the iPhone Simulator, but you will need to run the project on a device to select different songs and to play the music in the background.

Starter project

To start things off, download this starter project. The starter project has the following functionality:

  1. It provides a simple user interface for the application.
  2. The supported interface orientation is set to landscape.
  3. The MediaPlayer.framework has been added to the project.
  4. It contains a method which allows you to pick songs from your iPod library.
  5. An image named particleTexture.png was added to the project for use by the particle system.
  6. The MeterTable.h and MeterTable.cpp C++ files were also added to the project. These were taken from the Apple sample project avTouch, and will be explained later on in this tutorial.

First, extract the downloaded project, open it in Xcode, and build and run. You should see the following:

You can tap the play button to switch between play and pause modes but you won’t hear any music until after you’ve added some code. Tap on the black area in the middle to hide/show the navigation bar and tool bar.

If you’re running in the iPhone Simulator and tap the magnifying glass icon on the bottom left, you’ll see the following warning:

This is because the iPhone Simulator doesn’t support accessing the music library. But if you are running on a device, a tap on that icon will make the media picker appear, so that you can choose a song.

Once you are familiar with the user interface, let’s get started.

Let the Music Play

Using AVAudioPlayer is an easy way to play music on an iOS device. AVAudioPlayer can be found in the AVFoundation.framework, so you need to add this framework to your project.

Note: If you are interested in learning more about the AVAudioPlayer class and what it can do, take a look at our Audio 101 for iPhone Developers: Playing Audio Programmatically tutorial.

Select iPodVisualizer in the Project Navigator and then select iPodVisualizer under TARGETS. Choose the Build Phases tab, expand the Link Binary With Libraries section, then click the + (plus) button.

Search for AVFoundation.framework in the pop up list, select it, and click Add. The framework should now appear in your project.

It’s time to write some code. Open ViewController.m and make the following changes:

// Add to the #imports section at the top of the file
#import <AVFoundation/AVFoundation.h>
 
// Add the following under the comment that reads "Add properties here"
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;

This imports the AVFoundation.h header file so you can access AVAudioPlayer, and then adds a property that will hold the AVAudioPlayer instance your app will use to play audio.

And now, it’s time to play a music file.

The starter project includes a music file named DemoSong.m4a in the Resources folder that you can use. Feel free to use a different audio file if you’d like. Just remember, only the following audio codecs are supported on iOS devices for playback:

  • AAC (MPEG-4 Advanced Audio Coding)
  • ALAC (Apple Lossless)
  • HE-AAC (MPEG-4 High Efficiency AAC)
  • iLBC (internet Low Bitrate Codec, a format for speech)
  • IMA4 (IMA/ADPCM)
  • Linear PCM (uncompressed, linear pulse-code modulation)
  • MP3 (MPEG-1 audio layer 3)
  • µ-law and a-law

Still in ViewController.m, add the following method:

- (void)configureAudioPlayer {
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:@"DemoSong" withExtension:@"m4a"];
    NSError *error = nil;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:&error];
    if (!self.audioPlayer) {
        NSLog(@"%@", [error localizedDescription]);
    }
    [_audioPlayer setNumberOfLoops:-1];
}

This method creates a URL that references the music file and stores it in audioFileURL. It then creates a new AVAudioPlayer instance initialized with audioFileURL and sets its numberOfLoops property to -1 to make the audio loop forever.

Note: If you decide to use a music file other than the provided one, do remember to add the new file to the Xcode project and to change the music file name (and perhaps the extension) in the above method.

Add the following line to the end of viewDidLoad:

[self configureAudioPlayer];

By calling configureAudioPlayer in viewDidLoad, you set up the audio player as soon as the view loads, so you can press the play button on app start and have the app play your song.

Now add the following line inside playPause, just after the comment that reads // Pause audio here:

[_audioPlayer pause];

Next, add the following line in the same method, just after the comment that reads // Play audio here:

[_audioPlayer play];

Tapping the play/pause button calls playPause. The code you just added tells audioPlayer to play or pause according to its current state as defined by _isPlaying. As the name indicates, this property identifies whether the audio player is currently playing audio or not.
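To put those two lines in context, the completed playPause should look roughly like the sketch below. Everything other than the pause and play calls approximates the starter project’s existing code (the _isPlaying toggle and the button-image updates), so treat those parts as assumptions about how the starter code is structured:

```objc
// Sketch of the completed playPause. Only the [_audioPlayer pause] and
// [_audioPlayer play] lines are additions from this tutorial; the rest
// approximates the starter project's existing logic.
- (IBAction)playPause:(id)sender {
    if (_isPlaying) {
        // Pause audio here
        [_audioPlayer pause];
        // (starter code also swaps the button image back to the play icon)
    } else {
        // Play audio here
        [_audioPlayer play];
        // (starter code also swaps the button image to the pause icon)
    }
    _isPlaying = !_isPlaying;
}
```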

Now build and run. If you did everything correctly the app will look exactly the same. But now you can play/pause your music.

Take this brief moment to get your funk on! :]

Selecting a Song

A music player that just plays one song, no matter how cool that song may be, isn’t very useful. So you’ll add the ability to play audio from the device’s music library.

If you don’t plan on running on a device, or know how to set that up already, you can skip to the next section.

The starter project you downloaded is set up so that when the user chooses a song from the media picker, a URL for the selected song is passed to playURL: inside ViewController.m. Currently, playURL: just toggles the icon on the play/pause button.

Inside ViewController.m, add the following code to playURL: just after the comment that reads // Add audioPlayer configurations here:

self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[_audioPlayer setNumberOfLoops:-1];

The above code is much the same as what you wrote in configureAudioPlayer. However, instead of hardcoding the filename, you create a new AVAudioPlayer instance with the URL passed into the method.

Build and run on a device, and you’ll be able to choose and play a song from your music library.

Note: If you have iTunes Match, you may see items in the media picker that are not actually on your device. If you choose a song that is not stored locally, the app dismisses the media picker and does not play the audio. So if you want to hear (and soon see) something, be sure to choose a file that’s actually there :]

While running the project on a device, press the home button. You’ll notice that your music is paused. This isn’t a very good experience for a music player application, if a music player is what you’re after.

You can configure your app so that the music will continue to play even when the app enters the background. Keep in mind that this is another feature not supported in the iPhone Simulator, so run the app on a device if you want to see how it works.

To play music in the background, you need to do two things: set the audio session category, then declare the app as supporting background execution.

First, set the audio session category.

An audio session is the intermediary between your application and iOS for configuring audio behavior. Configuring your audio session establishes basic audio behavior for your application. You set your audio session category according to what your app does and how you want it to interact with the device and the system.

Add the following new method to ViewController.m:

- (void)configureAudioSession {
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];

    if (error) {
        NSLog(@"Error setting category: %@", [error description]);
    }
}

In configureAudioSession, you get the audio session using [AVAudioSession sharedInstance] and set its category to AVAudioSessionCategoryPlayback. This identifies that the current audio session will be used for playing back audio (as opposed to recording or processing audio).

Add the following line to viewDidLoad, just before the call to [self configureAudioPlayer];:

[self configureAudioSession];

This calls configureAudioSession to configure the audio session.

Note: To learn more about audio sessions, read Apple’s Audio Session Programming Guide. Or take a look at our Background Modes in iOS Tutorial which also covers the topic, albeit not in as much detail.

Now you have to declare that your app supports background execution.

Open iPodVisualizer-Info.plist (it’s in the Supporting Files folder), select the last line, and click the plus button to add a new item. Select Required background modes as the Key from the dropdown, and the type of the item will change to Array automatically. (If it does not automatically become Array, double check the Key.)

Expand the item and set the value of Item 0 to App plays audio. (If you have a wide Xcode window, you might not notice that the value field is a dropdown list. You can access the list by clicking the dropdown icon at the end of the field.)
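If you prefer to edit the plist as raw XML (right-click iPodVisualizer-Info.plist and choose Open As\Source Code), the same entry looks like this. "Required background modes" and "App plays audio" are just Xcode's human-readable names for the underlying key and value:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```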

When you are done, build and run on a device, pick a song and play it, press the home button, and this time your music should continue to play without interruption even if your app is in the background.

Visualizing with Music

Your music visualizer will be based on a UIKit particle system. If you don’t know much about particle systems, you may want to read UIKit Particle Systems In iOS 5 or How To Make a Letter / Word Game with UIKit: Part 3/3 to familiarize yourself with the necessary background information; this tutorial does not go into detail explaining the particle system basics.

First, add the QuartzCore.framework to your project (the same way you added the AVFoundation.framework).

Now choose File/New/File…, and select the iOS/Cocoa Touch/Objective-C class template. Name the class VisualizerView, make it a subclass of UIView, click Next and then Create.

Select VisualizerView.m in the Xcode Project Navigator and change its extension from .m to .mm. (You can rename it by clicking the file twice slowly in the Project Navigator. That is, do not click it fast enough to be considered a double-click.) The .mm extension tells Xcode that this file needs to be compiled as C++, which is necessary because later it will access the C++ class MeterTable.

Open VisualizerView.mm and replace its contents with the following:

#import "VisualizerView.h"
#import <QuartzCore/QuartzCore.h>
 
@implementation VisualizerView {
    CAEmitterLayer *emitterLayer;
}
 
// 1
+ (Class)layerClass {
    return [CAEmitterLayer class];
}
 
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        [self setBackgroundColor:[UIColor blackColor]];
        emitterLayer = (CAEmitterLayer *)self.layer;
 
        // 2
        CGFloat width = MAX(frame.size.width, frame.size.height);
        CGFloat height = MIN(frame.size.width, frame.size.height);
        emitterLayer.emitterPosition = CGPointMake(width/2, height/2);
        emitterLayer.emitterSize = CGSizeMake(width-80, 60);
        emitterLayer.emitterShape = kCAEmitterLayerRectangle;
        emitterLayer.renderMode = kCAEmitterLayerAdditive;
 
        // 3
        CAEmitterCell *cell = [CAEmitterCell emitterCell];
        cell.name = @"cell";
        cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
 
        // 4
        cell.color = [[UIColor colorWithRed:1.0f green:0.53f blue:0.0f alpha:0.8f] CGColor];
        cell.redRange = 0.46f;
        cell.greenRange = 0.49f;
        cell.blueRange = 0.67f;
        cell.alphaRange = 0.55f;
 
        // 5
        cell.redSpeed = 0.11f;
        cell.greenSpeed = 0.07f;
        cell.blueSpeed = -0.25f;
        cell.alphaSpeed = 0.15f;
 
        // 6
        cell.scale = 0.5f;
        cell.scaleRange = 0.5f;
 
        // 7
        cell.lifetime = 1.0f;
        cell.lifetimeRange = .25f;
        cell.birthRate = 80;
 
        // 8
        cell.velocity = 100.0f;
        cell.velocityRange = 300.0f;
        cell.emissionRange = M_PI * 2;
 
        // 9
        emitterLayer.emitterCells = @[cell];
    }
    return self;
}
 
@end

The above code mainly configures a UIKit particle system, as follows:

  1. Overrides layerClass to return CAEmitterLayer, which allows this view to act as a particle emitter.
  2. Shapes the emitter as a rectangle that extends across most of the center of the screen. Particles are initially created within this area.
  3. Creates a CAEmitterCell that renders particles using particleTexture.png, included in the starter project.
  4. Sets the particle color, along with a range by which each of the red, green, and blue color components may vary.
  5. Sets the speed at which the color components change over the lifetime of the particle.
  6. Sets the scale and the amount by which the scale can vary for the generated particles.
  7. Sets the amount of time each particle will exist to between .75 and 1.25 seconds, and sets it to create 80 particles per second.
  8. Configures the emitter to create particles with a variable velocity, and to emit them in any direction.
  9. Adds the emitter cell to the emitter layer.

Again, read the previously mentioned tutorials if you would like to know more about the fun things you can do with UIKit particle systems and how the above configuration values affect the generated particles.

Next open ViewController.m and make the following changes:

//Add with the other imports
#import "VisualizerView.h"
 
//Add with the other properties
@property (strong, nonatomic) VisualizerView *visualizer;

Now add the following to viewDidLoad, just before the line that reads [self configureAudioPlayer];:

self.visualizer = [[VisualizerView alloc] initWithFrame:self.view.frame];
[_visualizer setAutoresizingMask:UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth];
[_backgroundView addSubview:_visualizer];

This creates a VisualizerView instance that will fill its parent view and adds it to _backgroundView. (_backgroundView was defined as part of the starter project, and is just a view layered behind the music controls.)

Build and run, and you’ll see the particle system in action immediately:

While that looks very cool indeed, you want the particles to “beat” in sync with your music. This is done by changing the size of particles when the decibel level of the music changes.

First, open VisualizerView.h and make the following changes:

//Add with the other imports
#import <AVFoundation/AVFoundation.h>
 
//Add within the @interface and @end lines
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;

The new property will give your visualizer access to the app’s audio player, and hence the audio levels, but before you can use that information, you need to set up one more thing.

Switch to ViewController.m and search for setNumberOfLoops. If you skipped the section about running on the device, it will appear only once (in configureAudioPlayer); otherwise, it will appear twice (in configureAudioPlayer and in playURL:).

Add the following code just after any occurrence of the line [_audioPlayer setNumberOfLoops:-1];:

[_audioPlayer setMeteringEnabled:YES];
[_visualizer setAudioPlayer:_audioPlayer];

With the above code, you instruct the AVAudioPlayer instance to make audio-level metering data available. You then pass _audioPlayer to the _visualizer so that it can access that data.

Now switch to VisualizerView.mm and modify it as follows:

// Add with the other imports
#import "MeterTable.h"
 
// Change the private variable section of the implementation to look like this
@implementation VisualizerView {
    CAEmitterLayer *emitterLayer;
    MeterTable meterTable;
}

The above code gives you access to a MeterTable instance named meterTable. The starter project includes the C++ class MeterTable, which you’ll use to help process the audio levels from AVAudioPlayer.

What’s all this talk about metering? It should be easy to understand once you see the image below:

You’ve most likely seen something similar on the front of a sound system, bouncing along to the music. It simply shows you the relative intensity of the audio at any given time. MeterTable is a helper class that can be used to divide decibel values into ranges used to produce images like the one above.

You will use MeterTable to convert values into a range from 0 to 1 and you will use that new value to adjust the size of the particles in your music visualizer.

Add the following method to VisualizerView.mm:

- (void)update
{
    // 1
    float scale = 0.5;
    if (_audioPlayer.playing )
    {
        // 2
        [_audioPlayer updateMeters];
 
        // 3
        float power = 0.0f;
        for (int i = 0; i < [_audioPlayer numberOfChannels]; i++) {
            power += [_audioPlayer averagePowerForChannel:i];
        }
        power /= [_audioPlayer numberOfChannels];
 
        // 4
        float level = meterTable.ValueAt(power);
        scale = level * 5;
    }
 
    // 5
    [emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];
}

Each time the above method is called, it updates the size of the visualizer’s particles. Here’s how it works:

  1. You set scale to a default value of 0.5 and then check to see whether or not _audioPlayer is playing.
  2. If it is playing, you call updateMeters on _audioPlayer, which refreshes the AVAudioPlayer data based on the current audio.
  3. This is the meat of the method. For each audio channel (e.g. two for a stereo file), the average power for that channel is added to power. The average power is a decibel value. After the powers of all the channels have been added together, power is divided by the number of channels. This means power now holds the average power, or decibel level, for all of the audio.
  4. Here you pass the calculated average power value to meterTable‘s ValueAt method. It returns a value from 0 to 1, which you multiply by 5 and then set that as the scale. Multiplying by 5 accentuates the music’s effect on the scale.

    Note: Why use meterTable to convert power‘s value? Because it simplifies the code you have to write. Otherwise, your code would have to cover the broad range of values returned by averagePowerForChannel:. A return value of 0 indicates full scale, or maximum power; a return value of -160 indicates minimum power (that is, near silence). But the signal provided to the audio player may actually exceed the range of what’s considered full scale, so values can still go beyond those limits. Using meterTable gives you a nice value from 0 to 1. No fuss, no muss.

  5. Finally, the scale of the emitter’s particles is set to the new scale value. (If _audioPlayer was not playing, this will be the default scale of 0.5; otherwise, it will be some value based on the current audio levels.)

Right now your app doesn’t call update and so the new code has no effect. Fix that by modifying initWithFrame: in VisualizerView.mm, adding the following lines just after emitterLayer.emitterCells = @[cell]; (but still inside the closing curly brace):

CADisplayLink *dpLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(update)];
[dpLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];

Here you set up a CADisplayLink. A CADisplayLink is a timer that allows your application to synchronize its drawing to the refresh rate of the display. It behaves much like an NSTimer with a 1/60-second time interval, except that it’s guaranteed to fire each time the device prepares to redraw the screen, which is usually 60 times per second.

The first line you added above creates an instance of CADisplayLink set up to call update on the target self. That means it will call the update method you just defined during each screen refresh.

The second line calls addToRunLoop:forMode:, which starts the display link timer.

Note: Adding the CADisplayLink to a run loop is a low-level concept related to threading. For this tutorial, you just need to understand that the CADisplayLink will be called for every screen update. But if you want to learn more, you can check out the class references for CADisplayLink or NSRunLoop, or read through the Run Loops chapter in Apple’s Threading Programming Guide.

Now build, run, and play some music. You’ll notice that the particles change size, but they don’t “beat” with the music. This is because the scale change cannot affect particles that already exist on screen; it only applies to newly created particles.

This needs to be fixed.

Open VisualizerView.mm and modify initWithFrame: as follows:

    // Remove this line
    // cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
 
    // And replace it with the following lines
    CAEmitterCell *childCell = [CAEmitterCell emitterCell];
    childCell.name = @"childCell";
    childCell.lifetime = 1.0f / 60.0f;
    childCell.birthRate = 60.0f;
    childCell.velocity = 0.0f;
 
    childCell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
 
    cell.emitterCells = @[childCell];

Like CAEmitterLayer, CAEmitterCell also has a property named emitterCells. This means that a CAEmitterCell can contain another CAEmitterCell, resulting in particles emitting particles. That’s right, folks, it’s particles all the way down! :]

Also notice that you set the child’s lifetime to 1/60 seconds. This means that particles emitted by childCell will have a lifetime the same length as one screen refresh. You set birthRate to 60, which means 60 particles will be emitted per second. Since each dies in 1/60th of a second, a new particle is always created just as the previous one dies. And you thought your day was short :]

Build and run, and you’ll see the particle system works the same as it did before – but it still doesn’t beat to the music. You can try setting birthRate to 30 to help you understand how the setting works (just don’t forget to set it back to 60).

So how do you get the particle system to beat to the music?

The last line of update currently looks like this:

[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];

Replace that line with the following:

[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.emitterCells.childCell.scale"];

Now build and run, and this time you’ll see all the particles beat with your music.


So what did the above change do?

Particles are created and destroyed at the same rate as a screen refresh. That means that every time the screen is redrawn, a new set of particles is created and the previous set is destroyed. Since new particles are always created with a size calculated from the audio-levels at that moment, the particles appear to pulse with the music.

Congratulations, you have just made a cool music visualizer application!

Where to go from here?

Here is the complete example project with all of the code from the above tutorial.

This tutorial gave you a basic idea of how to add a music visualization system to your app. But you can take it further:

  • You can add more music controls to make the project a fully functional music player.
  • You could create a slightly more sophisticated visualizer that modifies a separate particle system for each audio channel, rather than blending all audio channels into a single value.
  • Try creating different kinds of particle systems (this tool, UIEffectDesigner, may help).
  • Or maybe try changing the shape of your emitter layer and moving it around within the view.

While you’re at it, check out Apple’s sample project aurioTouch2. It’s an advanced use of music visualization and a great way to learn more about the subject.

Have fun!
