- Credits
- Foreword
- About the Authors
- About the Reviewers
- www.PacktPub.com
- Table of Contents
- Preface
- Introducing SFML
- Downloading and installation
- A minimal example
- A few notes on C++
- Developing the first game
- The Game class
- Game loops and frames
- Input over several frames
- Vector algebra
- Frame-independent movement
- Fixed time steps
- Other techniques related to frame rates
- Displaying sprites on the screen
- File paths and working directories
- Real-time rendering
- Adapting the code
- Summary
- Defining resources
- Resources in SFML
- Textures
- Images
- Fonts
- Shaders
- Sound buffers
- Music
- A typical use case
- Graphics
- Audio
- Acquiring, releasing, and accessing resources
- An automated approach
- Finding an appropriate container
- Loading from files
- Accessing the textures
- Error handling
- Boolean return values
- Throwing exceptions
- Assertions
- Generalizing the approach
- Compatibility with sf::Music
- A special case – sf::Shader
- Summary
- Entities
- Aircraft
- Alternative entity designs
- Rendering the scene
- Relative coordinates
- SFML and transforms
- Scene graphs
- Scene nodes
- Node insertion and removal
- Making scene nodes drawable
- Drawing entities
- Connecting entities with resources
- Aligning the origin
- Scene layers
- Updating the scene
- One step back – absolute transforms
- The view
- Viewport
- View optimizations
- Resolution and aspect ratio
- View scrolling
- Zoom and rotation
- Landscape rendering
- SpriteNode
- Landscape texture
- Texture repeating
- Composing our world
- World initialization
- Loading the textures
- Building the scene
- Update and draw
- Integrating the Game class
- Summary
- Polling events
- Window events
- Joystick events
- Keyboard events
- Mouse events
- Getting the input state in real time
- Events and real-time input – when to use which
- Delta movement from the mouse
- Playing nice with your application neighborhood
- A command-based communication system
- Introducing commands
- Receiver categories
- Command execution
- Command queues
- Handling player input
- Commands in a nutshell
- Implementing the game logic
- A general-purpose communication mechanism
- Customizing key bindings
- Why a player is not an entity
- Summary
- Defining a state
- The state stack
- Adding states to StateStack
- Handling updates, input, and drawing
- Input
- Update
- Draw
- Delayed pop/push operations
- The state context
- Integrating the stack in the Application class
- Navigating between states
- Creating the game state
- The title screen
- Main menu
- Pausing the game
- The loading screen – sample
- Progress bar
- ParallelTask
- Thread
- Concurrency
- Task implementation
- Summary
- The GUI hierarchy, the Java way
- Updating the menu
- The promised key bindings
- Summary
- Equipping the entities
- Introducing hitpoints
- Storing entity attributes in data tables
- Displaying text
- Creating enemies
- Movement patterns
- Spawning enemies
- Adding projectiles
- Firing bullets and missiles
- Homing missiles
- Picking up some goodies
- Collision detection and response
- Finding the collision pairs
- Reacting to collisions
- An outlook on optimizations
- An interacting world
- Cleaning everything up
- Out of view, out of the world
- The final update
- Victory and defeat
- Summary
- Defining texture atlases
- Adapting the game code
- Low-level rendering
- OpenGL and graphics cards
- Understanding render targets
- Texture mapping
- Vertex arrays
- Particle systems
- Particles and particle types
- Particle nodes
- Emitter nodes
- Affectors
- Embedding particles in the world
- Animated sprites
- The Eagle has rolled!
- Post effects and shaders
- Fullscreen post effects
- Shaders
- The bloom effect
- Summary
- Music themes
- Loading and playing
- Use case – In-game themes
- Sound effects
- Loading, inserting, and playing
- Removing sounds
- Use case – GUI sounds
- Sounds in 3D space
- The listener
- Attenuation factor and minimum distance
- Positioning the listener
- Playing spatial sounds
- Use case – In-game sound effects
- Summary
- Playing multiplayer games
- Interacting with sockets
- Socket selectors
- Custom protocols
- Data transport
- Network architectures
- Peer-to-peer
- Client-server architecture
- Authoritative servers
- Creating the structure for multiplayer
- Working with the Server
- Server thread
- Server loop
- Peers and aircraft
- Hot Seat
- Accepting new clients
- Handling disconnections
- Incoming packets
- Studying our protocol
- Understanding the ticks and updates
- Synchronization issues
- Taking a peek in the other end – the client
- Client packets
- Transmitting game actions via network nodes
- The new pause state
- Settings
- The new Player class
- Latency
- Latency versus bandwidth
- View scrolling compensation
- Aircraft interpolation
- Cheating prevention
- Summary
- Index
Chapter 9
In the Button::activate() method, which is called when a button is clicked, we'll play the corresponding sound:
void Button::activate()
{
    ...
    mSounds.play(SoundEffect::Button);
}
Several Button objects are instantiated in their corresponding state classes. As a short reminder, here is an excerpt of the constructor of one such state class. The context is passed to the buttons, and each button extracts the sound player from it. Additionally, the music theme is played in that constructor:
MenuState::MenuState(StateStack& stack, Context context)
: State(stack, context)
, ...
{
    auto playButton     = std::make_shared<GUI::Button>(context);
    auto settingsButton = std::make_shared<GUI::Button>(context);
    auto exitButton     = std::make_shared<GUI::Button>(context);
    ...
    context.music->play(Music::MenuTheme);
}
Now you'll hear a sound every time you activate a button.
Sounds in 3D space
The most interesting part about sound effects is yet to come. An immersive atmosphere only builds up if sounds are properly located within the game world. Like graphical objects, sounds can have a position.
The coordinate system for sounds is three-dimensional. SFML's sound API works with the sf::Vector3f type, a 3D vector with the members x, y, and z. SFML internally uses Open Audio Library (OpenAL), an interface for low-level audio functionality, which is also the origin of the sound spatialization concepts we are going to discuss here. Spatializing sounds means nothing more than to locate them in the 3D space, that is, to give them a spatial representation.
Cranking Up the Bass – Music and Sound Effects
The listener
How spatial sound effects are perceived depends on the listener. A useful analogy is to compare the listener with your head. The listener's location and orientation can be described with the following three 3D vectors:
•Position: This vector describes where the listener is located in 3D space.
•Up: This vector tells where the top of the head points to. In SFML 2.0, the up vector is hardcoded to (0, 1, 0); "up" thus always lies in the +Y direction.
•Direction: This vector expresses where the listener is "looking". It is a relative vector, not a position in space. SFML uses a default direction of (0, 0, -1), meaning that the listener faces towards the negative Z axis. The direction must be linearly independent from the up vector, so don't choose a direction with both its X and Z components set to zero. The direction vector need not be of unit length.
[Figure: the listener in 3D space. The position vector locates the listener; the up vector points along +Y; the direction vector points along the viewing direction, by default towards -Z.]
The orientation of the listener determines how sounds are perceived. If a sound is played on the right-hand side of the listener, the user will hear it in the right headphone. If you have a surround system, you will also be able to differentiate between front and rear sounds.
SFML provides the sf::Listener class, which has the setPosition() and setDirection() methods to set the corresponding attributes. The sf::Listener class contains only static methods; it is not intended to be instantiated.
Attenuation factor and minimum distance
Close sounds are perceived as louder than distant ones. The sound's volume is inversely proportional to its distance from the listener (we have a 1/distance relationship, also known as the inverse distance model).
The attenuation factor determines how fast a sound is attenuated with distance. The higher the factor, the weaker the sound becomes at a given distance, or equivalently, the closer the sound has to get to be played at a given volume.
The minimum distance is the distance between the listener and the sound at which 100 percent volume is achieved. If the sound comes closer, the volume will not increase anymore. If the sound goes further away, it will be attenuated.
The following figure should give you a better understanding of how the perceived sound volume depends on the distance to the listener. There are two cases: distances smaller than the minimum distance yield a constant volume of 100 percent, while greater distances lead to attenuated sounds, depending on the attenuation factor.
[Figure: perceived gain as a function of distance. Below the minimum distance, the gain stays at a constant 100 percent volume; beyond it, the volume is attenuated with increasing distance.]
Attenuation factor and minimum distance are specific to each sound. SFML provides the setAttenuation() and setMinDistance() methods in the sf::Sound class. For sound spatialization, the setPosition() method is required to position a sound in space.
Positioning the listener
Our sounds are located in the plane of the monitor, thus their Z coordinate is always 0. But how do we place the listener? It is tempting to set the listener's Z coordinate to 0, just like the sounds. This is wrong. When you do this, a sound moving from left to right will pass directly through the listener. As a result, you first hear the sound only in the left ear, and only afterwards in the right ear. Even if the sound is very close, you will not hear it in both ears.
To get around this issue, we place the listener in a plane different from the one the sounds are in, just as your head is in front of the monitor, not inside it. The listener's Z coordinate therefore has a value greater than zero.
Let's say the 2D minimum distance is the number of world units between the listener's place in the 2D world and the sound. Since the listener itself resides outside the 2D world plane, the effective 3D minimum distance has to be computed with the Pythagorean theorem, as shown in the following figure:
[Figure: the 3D audio coordinate system laid over the 2D graphics coordinate system. The listener sits at height listener z above the 2D plane in which the sound lies; the 3D min distance is the hypotenuse of the right triangle formed by the 2D min distance and listener z.]
Playing spatial sounds
An important fact to keep in mind: to enable audio spatialization, a sound effect must have a single channel (mono). Stereo sounds are played at full volume, regardless of their position in space.
In SoundPlayer.cpp, we create an anonymous namespace for a few constants representing the measures discussed before. Don't hesitate to experiment with these values!
namespace
{
    const float ListenerZ = 300.f;
    const float Attenuation = 8.f;
    const float MinDistance2D = 200.f;
    const float MinDistance3D =
        std::sqrt(MinDistance2D*MinDistance2D + ListenerZ*ListenerZ);
}
Now we need to connect the 3D audio coordinate system with our 2D graphics coordinate system. Both coordinate systems are completely unrelated; how to map one onto another depends on the use case. In the previous figure, you see how the axes of both coordinate systems are aligned.
We implement the remaining functions of the sound player, beginning with the listener's position. The X coordinate is the same in both graphics and audio system. The Y coordinate needs to be negated, as the audio up vector is in the +Y direction, but on the screen, the Y axis points downwards. For the Z coordinate, we take the constant distance between the listener and the screen plane:
void SoundPlayer::setListenerPosition(sf::Vector2f position)
{
    sf::Listener::setPosition(position.x, -position.y, ListenerZ);
}
Next, we define the second overload of the play() function, which takes a 2D graphics position. To compute the 3D audio position, we again need to negate Y, but here we use Z=0. We also set the attenuation and minimum distance for the sound. Finally, the sound is played:
void SoundPlayer::play(SoundEffect::ID effect, sf::Vector2f position)
{
    mSounds.push_back(sf::Sound());
    sf::Sound& sound = mSounds.back();

    sound.setBuffer(mSoundBuffers.get(effect));
    sound.setPosition(position.x, -position.y, 0.f);
    sound.setAttenuation(Attenuation);
    sound.setMinDistance(MinDistance3D);
    sound.play();
}