2012-02-27

I wrote a simple class for playing audio files in a simple game. Small sounds such as gunshots or explosions work fine, but when I try to use it for background music I get the error 'Failed to allocate clip data: Requested buffer too large.' I assume this means the file is too big, but how can I get around the problem? Source: How do I play a long AudioClip?

import java.io.File; 
import javax.sound.sampled.AudioInputStream; 
import javax.sound.sampled.AudioSystem; 
import javax.sound.sampled.Clip; 

public class Sound{ 

private Clip clip; 

public Sound(String filepath){ 
    System.out.println(filepath); 
    File file = new File(filepath); 
    try { 
     clip = AudioSystem.getClip(); 
     AudioInputStream inputStream = AudioSystem.getAudioInputStream(file); 
     clip.open(inputStream); 
    } catch (Exception e) { 
     System.err.println(e.getMessage()); 
    } 
} 

public void play(){ 
    System.out.println("play"); 
    if(clip.isActive()){ 
     clip.stop(); 
    } 
    clip.setFramePosition(0); 
    clip.start(); 
} 

public void stop(){ 
    clip.stop(); 
} 

public void loop(){ 
    if(!clip.isActive()){ 
     clip.setFramePosition(0); 
     clip.loop(Clip.LOOP_CONTINUOUSLY); 
    }else{ 
     System.out.println("ALREADY PLAYING"); 
    } 

} 

public boolean getActive(){return clip.isActive();} 
} 

A SourceDataLine is typically used for long sounds or background music. I'm curious why you decided not to use one. –

Answers


Use a BigClip. It's a class I put together to play MP3s of 12-18 minutes (or more) in length.

It does require mp3plugin.jar on the runtime classpath to load MP3-format sound, but that is beside the point. The point is:

  1. BigClip loads the sound file into memory, up to the maximum the JVM can allocate before an OutOfMemoryError.
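To see why a whole track strains memory, it helps to estimate the raw PCM footprint of a decoded track. This sketch (the class and method names are illustrative, not part of BigClip) computes the byte count for a given duration:

```java
public class PcmFootprint {

    /** Bytes of raw PCM audio for the given format parameters and duration. */
    static long pcmBytes(float sampleRate, int sampleSizeInBits, int channels, int seconds) {
        int frameSize = (sampleSizeInBits / 8) * channels; // bytes per sample frame
        return (long) (sampleRate * frameSize) * seconds;
    }

    public static void main(String[] args) {
        // A 3-minute track at CD quality (44100 Hz, 16-bit, stereo).
        long bytes = pcmBytes(44100f, 16, 2, 180);
        System.out.println(bytes + " bytes, ~" + (bytes / (1024 * 1024)) + " MiB");
        // 31752000 bytes, ~30 MiB of heap for a single decoded track
    }
}
```

At roughly 10 MiB per minute of CD-quality PCM, a long soundtrack can plausibly exhaust a small default heap; that is the trade-off BigClip makes in exchange for easy seeking and looping.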

import java.awt.Component; 
import javax.swing.*; 
import javax.sound.sampled.*; 
import java.io.*; 
import java.util.logging.*; 
import java.util.Arrays; 

import java.net.URL; 
import javax.swing.JOptionPane; 

class BigClipExample { 

    public static void main(String[] args) throws Exception { 
     URL url = new URL("http://pscode.org/media/leftright.wav"); 
     BigClip clip = new BigClip(); 
     AudioInputStream ais = AudioSystem.getAudioInputStream(url); 
     clip.open(ais); 
     clip.start(); 
     JOptionPane.showMessageDialog(null, "BigClip.start()"); 
     clip.loop(4); 
     JOptionPane.showMessageDialog(null, "BigClip.loop(4)"); 
     clip.setFastForward(true); 
     clip.loop(8); 
     // the looping/FF combo. reveals a bug.. 
     // there is a slight 'click' in the sound that should not be audible 
     JOptionPane.showMessageDialog(null, "Are you on speed?"); 
    } 
} 

/** An implementation of the javax.sound.sampled.Clip that is designed 
to handle Clips of arbitrary size, limited only by the amount of memory 
available to the app. It uses the post 1.4 thread behaviour (daemon thread) 
that will stop the sound running after the main has exited. 
<ul> 
<li>2012-02-29 - Reworked play/loop to fix several bugs. 
<li>2009-09-01 - Fixed bug that had clip ..clipped at the end, by calling drain() (before 
calling stop()) on the dataline after the play loop was complete. Improvement to frame 
and microsecond position determination. 
<li>2009-08-17 - added convenience constructor that accepts a Clip. Changed the private 
convertFrameToM..seconds methods from 'micro' to 'milli' to reflect that they were dealing 
with units of 1000/th of a second. 
<li>2009-08-14 - got rid of flush() after the sound loop, as it was cutting off tracks just 
before the end, and was found to be not needed for the fast-forward/rewind functionality it 
was introduced to support. 
<li>2009-08-11 - First binary release. 
</ul> 
N.B. Remove @Override notation and logging to use in 1.3+ 
@since 1.5 
@version 2012-02-29 
@author Andrew Thompson 
@author Alejandro Garcia */ 
class BigClip implements Clip, LineListener { 

    /** The DataLine used by this Clip. */ 
    private SourceDataLine dataLine; 

    /** The raw bytes of the audio data. */ 
    private byte[] audioData; 

    /** The stream wrapper for the audioData. */ 
    private ByteArrayInputStream inputStream; 

    /** Loop count set by the calling code. */ 
    private int loopCount = 1; 
    /** Internal count of how many loops to go. */ 
    private int countDown = 1; 
    /** The start of a loop point. Defaults to 0. */ 
    private int loopPointStart; 
    /** The end of a loop point. Defaults to the end of the Clip. */ 
    private int loopPointEnd; 

    /** Stores the current frame position of the clip. */ 
    private int framePosition; 

    /** Thread used to run() sound. */ 
    private Thread thread; 
    /** Whether the sound is currently playing or active. */ 
    private boolean active; 
    /** Stores the last time bytes were dumped to the audio stream. */ 
    private long timelastPositionSet; 

    private int bufferUpdateFactor = 2; 

    /** The parent Component for the loading progress dialog. */ 
    Component parent = null; 

    /** Used for reporting messages. */ 
    private Logger logger = Logger.getAnonymousLogger(); 

    /** Default constructor for a BigClip. Does nothing. Information from the 
    AudioInputStream passed in open() will be used to get an appropriate SourceDataLine. */ 
    public BigClip() {} 

    /** There are a number of AudioSystem methods that will return a configured Clip. This 
    convenience constructor allows us to obtain a SourceDataLine for the BigClip that uses 
    the same AudioFormat as the original Clip. 
    @param clip Clip The Clip used to configure the BigClip. */ 
    public BigClip(Clip clip) throws LineUnavailableException { 
     dataLine = AudioSystem.getSourceDataLine(clip.getFormat()); 
    } 

    /** Provides the entire audio buffer of this clip. 
    @return audioData byte[] The bytes of the audio data that is loaded in this Clip. */ 
    public byte[] getAudioData() { 
     return audioData; 
    } 

    /** Sets a parent component to act as owner of a "Loading track.." progress dialog. 
    If null, there will be no progress shown. */ 
    public void setParentComponent(Component parent) { 
     this.parent = parent; 
    } 

    /** Converts a frame count to a duration in milliseconds. */ 
    private long convertFramesToMilliseconds(int frames) { 
     return (frames/(long)dataLine.getFormat().getSampleRate())*1000; 
    } 

    /** Converts a duration in milliseconds to a frame count. */ 
    private int convertMillisecondsToFrames(long milliseconds) { 
     return (int)(milliseconds/dataLine.getFormat().getSampleRate()); 
    } 

    @Override 
    public void update(LineEvent le) { 
     logger.log(Level.FINEST, "update: " + le); 
    } 

    @Override 
    public void loop(int count) { 
     logger.log(Level.FINEST, "loop(" + count + ") - framePosition: " + framePosition); 
     loopCount = count; 
     countDown = count; 
     active = true; 
     inputStream.reset(); 

     start(); 
    } 

    @Override 
    public void setLoopPoints(int start, int end) { 
     if (
      start<0 || 
      start>audioData.length-1 || 
      end<0 || 
      end>audioData.length 
      ) { 
      throw new IllegalArgumentException(
       "Loop points '" + 
       start + 
       "' and '" + 
       end + 
       "' cannot be set for buffer of size " + 
       audioData.length); 
     } 
     if (start>end) { 
      throw new IllegalArgumentException(
       "End position " + 
       end + 
       " preceeds start position " + start); 
     } 

     loopPointStart = start; 
     framePosition = loopPointStart; 
     loopPointEnd = end; 
    } 

    @Override 
    public void setMicrosecondPosition(long milliseconds) { 
     framePosition = convertMillisecondsToFrames(milliseconds); 
    } 

    @Override 
    public long getMicrosecondPosition() { 
     return convertFramesToMilliseconds(getFramePosition()); 
    } 

    @Override 
    public long getMicrosecondLength() { 
     return convertFramesToMilliseconds(getFrameLength()); 
    } 

    @Override 
    public void setFramePosition(int frames) { 
     framePosition = frames; 
     int offset = framePosition*format.getFrameSize(); 
     try { 
      inputStream.reset(); 
      inputStream.read(new byte[offset]); 
     } catch(Exception e) { 
      e.printStackTrace(); 
     } 
    } 

    @Override 
    public int getFramePosition() { 
     long timeSinceLastPositionSet = System.currentTimeMillis() - timelastPositionSet; 
     int size = dataLine.getBufferSize()*(format.getChannels()/2)/bufferUpdateFactor; 
     int framesSinceLast = (int)((timeSinceLastPositionSet/1000f)* 
      dataLine.getFormat().getFrameRate()); 
     int framesRemainingTillTime = size - framesSinceLast; 
     return framePosition 
      - framesRemainingTillTime; 
    } 

    @Override 
    public int getFrameLength() { 
     return audioData.length/format.getFrameSize(); 
    } 

    AudioFormat format; 

    @Override 
    public void open(AudioInputStream stream) throws 
     IOException, 
     LineUnavailableException { 

     AudioInputStream is1; 
     format = stream.getFormat(); 

     if (format.getEncoding()!=AudioFormat.Encoding.PCM_SIGNED) { 
      is1 = AudioSystem.getAudioInputStream(
       AudioFormat.Encoding.PCM_SIGNED, stream); 
     } else { 
      is1 = stream; 
     } 
     format = is1.getFormat(); 
     InputStream is2; 
     if (parent!=null) { 
      ProgressMonitorInputStream pmis = new ProgressMonitorInputStream(
       parent, 
       "Loading track..", 
       is1); 
      pmis.getProgressMonitor().setMillisToPopup(0); 
      is2 = pmis; 
     } else { 
      is2 = is1; 
     } 

     byte[] buf = new byte[1 << 16]; // 64 KiB; note '2^16' in Java is XOR (== 18), not exponentiation 
     int totalRead = 0; 
     int numRead = 0; 
     ByteArrayOutputStream baos = new ByteArrayOutputStream(); 
     numRead = is2.read(buf); 
     while (numRead>-1) { 
      baos.write(buf, 0, numRead); 
      numRead = is2.read(buf, 0, buf.length); 
      totalRead += numRead; 
     } 
     is2.close(); 
     audioData = baos.toByteArray(); 
     AudioFormat afTemp; 
     if (format.getChannels()<2) { 
      afTemp = new AudioFormat(
       format.getEncoding(), 
       format.getSampleRate(), 
       format.getSampleSizeInBits(), 
       2, 
       format.getSampleSizeInBits()*2/8, // calculate frame size 
       format.getFrameRate(), 
       format.isBigEndian() 
       ); 
     } else { 
      afTemp = format; 
     } 

     setLoopPoints(0,audioData.length); 
     dataLine = AudioSystem.getSourceDataLine(afTemp); 
     dataLine.open(); 
     inputStream = new ByteArrayInputStream(audioData); 
    } 

    @Override 
    public void open(AudioFormat format, 
     byte[] data, 
     int offset, 
     int bufferSize) 
     throws LineUnavailableException { 
     byte[] input = new byte[bufferSize]; 
     for (int ii=0; ii<input.length; ii++) { 
      input[ii] = data[offset+ii]; 
     } 
     ByteArrayInputStream inputStream = new ByteArrayInputStream(input); 
     try { 
      AudioInputStream ais1 = AudioSystem.getAudioInputStream(inputStream); 
      AudioInputStream ais2 = AudioSystem.getAudioInputStream(format, ais1); 
      open(ais2); 
     } catch(UnsupportedAudioFileException uafe) { 
      throw new IllegalArgumentException(uafe); 
     } catch(IOException ioe) { 
      throw new IllegalArgumentException(ioe); 
     } 
     // TODO - throw IAE for invalid frame size, format. 
    } 

    @Override 
    public float getLevel() { 
     return dataLine.getLevel(); 
    } 

    @Override 
    public long getLongFramePosition() { 
     return dataLine.getLongFramePosition()*2/format.getChannels(); 
    } 

    @Override 
    public int available() { 
     return dataLine.available(); 
    } 

    @Override 
    public int getBufferSize() { 
     return dataLine.getBufferSize(); 
    } 

    @Override 
    public AudioFormat getFormat() { 
     return format; 
    } 

    @Override 
    public boolean isActive() { 
     return dataLine.isActive(); 
    } 

    @Override 
    public boolean isRunning() { 
     return dataLine.isRunning(); 
    } 

    @Override 
    public boolean isOpen() { 
     return dataLine.isOpen(); 
    } 

    @Override 
    public void stop() { 
     logger.log(Level.FINEST, "BigClip.stop()"); 
     active = false; 
     // why did I have this commented out? 
     dataLine.stop(); 
     if (thread!=null) { 
      try { 
       active = false; 
       thread.join(); 
      } catch(InterruptedException wakeAndContinue) { 
      } 
     } 
    } 

    public byte[] convertMonoToStereo(byte[] data, int bytesRead) { 
     byte[] tempData = new byte[bytesRead*2]; 
     if (format.getSampleSizeInBits()==8) { 
      for(int ii=0; ii<bytesRead; ii++) { 
       byte b = data[ii]; 
       tempData[ii*2] = b; 
       tempData[ii*2+1] = b; 
      } 
     } else { 
      for(int ii=0; ii<bytesRead-1; ii+=2) { 
       //byte b2 = is2.read(); 
       byte b1 = data[ii]; 
       byte b2 = data[ii+1]; 
       tempData[ii*2] = b1; 
       tempData[ii*2+1] = b2; 
       tempData[ii*2+2] = b1; 
       tempData[ii*2+3] = b2; 
      } 
     } 
     return tempData; 
    } 

    boolean fastForward; 
    boolean fastRewind; 

    public void setFastForward(boolean fastForward) { 
     logger.log(Level.FINEST, "FastForward " + fastForward); 
     this.fastForward = fastForward; 
     fastRewind = false; 
     flush(); 
    } 

    public boolean getFastForward() { 
     return fastForward; 
    } 

    public void setFastRewind(boolean fastRewind) { 
     logger.log(Level.FINEST, "FastRewind " + fastRewind); 
     this.fastRewind = fastRewind; 
     fastForward = false; 
     flush(); 
    } 

    public boolean getFastRewind() { 
     return fastRewind; 
    } 

    /** TODO - fix bug in LOOP_CONTINUOUSLY */ 
    @Override 
    public void start() { 
     Runnable r = new Runnable() { 
      public void run() { 
       try { 
        /* Should these open()/close() calls be here, or explicitly 
        called by user program? The JavaDocs for line suggest that 
        Clip should throw an IllegalArgumentException, so we'll 
        stick with that and call it explicitly. */ 
        dataLine.open(); 

        dataLine.start(); 

        active = true; 

        int bytesRead = 0; 
        int frameSize = dataLine.getFormat().getFrameSize(); 
        int bufSize = dataLine.getBufferSize(); 
        boolean startOrMove = true; 
        byte[] data = new byte[bufSize]; 
        int offset = framePosition*frameSize; 
        int totalBytes = offset; 
        bytesRead = inputStream.read(new byte[offset], 0, offset); 
        logger.log(Level.FINE, "bytesRead " + bytesRead); 
        bytesRead = inputStream.read(data,0,data.length); 

        logger.log(Level.FINE, "loopCount " + loopCount); 
        logger.log(Level.FINE, "countDown " + countDown); 
        logger.log(Level.FINE, "bytesRead " + bytesRead); 

        while (bytesRead != -1 && 
         (loopCount==Clip.LOOP_CONTINUOUSLY || 
         countDown>0) && 
         active) { 
         logger.log(Level.FINEST, 
          "BigClip.start() loop " + framePosition); 
         totalBytes += bytesRead; 
         int framesRead; 
         byte[] tempData; 
         if (format.getChannels()<2) { 
          tempData = convertMonoToStereo(data, bytesRead); 
          framesRead = bytesRead/ 
           format.getFrameSize(); 
          bytesRead*=2; 
         } else { 
          framesRead = bytesRead/ 
           dataLine.getFormat().getFrameSize(); 
          tempData = Arrays.copyOfRange(data, 0, bytesRead); 
         } 
         framePosition += framesRead; 
         if (framePosition>=loopPointEnd) { 
          framePosition = loopPointStart; 
          inputStream.reset(); 
          countDown--; 
          logger.log(Level.FINEST, 
           "Loop Count: " + countDown); 
         } 
         timelastPositionSet = System.currentTimeMillis(); 
         byte[] newData; 
         if (fastForward) { 
          newData = getEveryNthFrame(tempData, 2); 
         } else if (fastRewind) { 
          byte[] temp = getEveryNthFrame(tempData, 2); 
          newData = reverseFrames(temp); 
          inputStream.reset(); 
          totalBytes -= 2*bytesRead; 
         framePosition -= 2*framesRead; 
          if (totalBytes<0) { 
           setFastRewind(false); 
           totalBytes = 0; 
          } 
          inputStream.skip(totalBytes); 
          logger.log(Level.FINE, "totalBytes " + totalBytes); 
         } else { 
          newData = tempData; 
         } 
         dataLine.write(newData, 0, newData.length); 
         if (startOrMove) { 
          data = new byte[bufSize/ 
           bufferUpdateFactor]; 
          startOrMove = false; 
         } 
         bytesRead = inputStream.read(data,0,data.length); 
         if (bytesRead<0 && countDown-->1) { 
          inputStream.read(new byte[offset], 0, offset); 
          logger.log(Level.FINE, "loopCount " + loopCount); 
          logger.log(Level.FINE, "countDown " + countDown); 
          inputStream.reset(); 
          bytesRead = inputStream.read(data,0,data.length); 
         } 
        } 
        logger.log(Level.FINEST, 
         "BigClip.start() loop ENDED" + framePosition); 
        active = false; 
        countDown = 1; 
        framePosition = 0; 
        inputStream.reset(); 
        dataLine.drain(); 
        dataLine.stop(); 
        /* should these open()/close() be here, or explicitly 
        called by user program? */ 
        dataLine.close(); 
       } catch (LineUnavailableException lue) { 
        logger.log(Level.SEVERE, 
         "No sound line available!", lue); 
        if (parent!=null) { 
         JOptionPane.showMessageDialog(
          parent, 
          "Clear the sound lines to proceed", 
          "No audio lines available!", 
          JOptionPane.ERROR_MESSAGE); 
        } 
       } 
      } 
     }; 
     thread= new Thread(r); 
     // makes thread behaviour compatible with JavaSound post 1.4 
     thread.setDaemon(true); 
     thread.start(); 
    } 

    /** Assume the frame size is 4. */ 
    public byte[] reverseFrames(byte[] data) { 
     byte[] reversed = new byte[data.length]; 
     byte[] frame = new byte[4]; 

     for (int ii=0; ii<data.length/4; ii++) { 
      int first = (data.length)-((ii+1)*4)+0; 
      int last = (data.length)-((ii+1)*4)+3; 
      frame[0] = data[first]; 
      frame[1] = data[(data.length)-((ii+1)*4)+1]; 
      frame[2] = data[(data.length)-((ii+1)*4)+2]; 
      frame[3] = data[last]; 

      reversed[ii*4+0] = frame[0]; 
      reversed[ii*4+1] = frame[1]; 
      reversed[ii*4+2] = frame[2]; 
      reversed[ii*4+3] = frame[3]; 
      if (ii<5 || ii>(data.length/4)-5) { 
       logger.log(Level.FINER, "From \t" + first + " \tlast " + last); 
       logger.log(Level.FINER, "To \t" + ((ii*4)+0) + " \tlast " + ((ii*4)+3)); 
      } 
     } 

/* 
     for (int ii=0; ii<data.length; ii++) { 
      reversed[ii] = data[data.length-1-ii]; 
     } 
*/ 

     return reversed; 
    } 

    /** Assume the frame size is 4. */ 
    public byte[] getEveryNthFrame(byte[] data, int skip) { 
     int length = data.length/skip; 
     length = (length/4)*4; 
     logger.log(Level.FINEST, "length " + data.length + " \t" + length); 
     byte[] b = new byte[length]; 
     //byte[] frame = new byte[4]; 
     for (int ii=0; ii<b.length/4; ii++) { 
      b[ii*4+0] = data[ii*skip*4+0]; 
      b[ii*4+1] = data[ii*skip*4+1]; 
      b[ii*4+2] = data[ii*skip*4+2]; 
      b[ii*4+3] = data[ii*skip*4+3]; 
     } 
     return b; 
    } 

    @Override 
    public void flush() { 
     dataLine.flush(); 
    } 

    @Override 
    public void drain() { 
     dataLine.drain(); 
    } 

    @Override 
    public void removeLineListener(LineListener listener) { 
     dataLine.removeLineListener(listener); 
    } 

    @Override 
    public void addLineListener(LineListener listener) { 
     dataLine.addLineListener(listener); 
    } 

    @Override 
    public Control getControl(Control.Type control) { 
     return dataLine.getControl(control); 
    } 

    @Override 
    public Control[] getControls() { 
     if (dataLine==null) { 
      return new Control[0]; 
     } else { 
      return dataLine.getControls(); 
     } 
    } 

    @Override 
    public boolean isControlSupported(Control.Type control) { 
     return dataLine.isControlSupported(control); 
    } 

    @Override 
    public void close() { 
     dataLine.close(); 
    } 

    @Override 
    public void open() throws LineUnavailableException { 
     throw new IllegalArgumentException("illegal call to open() in interface Clip"); 
    } 

    @Override 
    public Line.Info getLineInfo() { 
     return dataLine.getLineInfo(); 
    } 

    /** Determines the single largest sample size of all channels of the current clip. 
This can be handy for determining a fraction to scale visual representations. 
    @return Double between 0 & 1 representing the maximum signal level of any channel. */ 
    public double getLargestSampleSize() { 

     int largest = 0; 
     int current; 

     boolean signed = (format.getEncoding()==AudioFormat.Encoding.PCM_SIGNED); 
     int bitDepth = format.getSampleSizeInBits(); 
     boolean bigEndian = format.isBigEndian(); 

     int samples = audioData.length*8/bitDepth; 

     if (signed) { 
      if (bitDepth/8==2) { 
       if (bigEndian) { 
        for (int cc = 0; cc < samples; cc++) { 
         current = (audioData[cc*2]*256 + (audioData[cc*2+1] & 0xFF)); 
         if (Math.abs(current)>largest) { 
          largest = Math.abs(current); 
         } 
        } 
       } else { 
        for (int cc = 0; cc < samples; cc++) { 
         current = (audioData[cc*2+1]*256 + (audioData[cc*2] & 0xFF)); 
         if (Math.abs(current)>largest) { 
          largest = Math.abs(current); 
         } 
        } 
       } 
      } else { 
       for (int cc = 0; cc < samples; cc++) { 
        current = (audioData[cc] & 0xFF); 
        if (Math.abs(current)>largest) { 
         largest = Math.abs(current); 
        } 
       } 
      } 
     } else { 
      if (bitDepth/8==2) { 
       if (bigEndian) { 
        for (int cc = 0; cc < samples; cc++) { 
         current = (audioData[cc*2]*256 + (audioData[cc*2+1] - 0x80)); 
         if (Math.abs(current)>largest) { 
          largest = Math.abs(current); 
         } 
        } 
       } else { 
        for (int cc = 0; cc < samples; cc++) { 
         current = (audioData[cc*2+1]*256 + (audioData[cc*2] - 0x80)); 
         if (Math.abs(current)>largest) { 
          largest = Math.abs(current); 
         } 
        } 
       } 
      } else { 
       for (int cc = 0; cc < samples; cc++) { 
        if (audioData[cc]>0) { 
         current = (audioData[cc] - 0x80); 
         if (Math.abs(current)>largest) { 
          largest = Math.abs(current); 
         } 
        } else { 
         current = (audioData[cc] + 0x80); 
         if (Math.abs(current)>largest) { 
          largest = Math.abs(current); 
         } 
        } 
       } 
      } 
     } 

     // audioData 
     logger.log(Level.FINEST, "Max signal level: " + (double)largest/(Math.pow(2, bitDepth-1))); 
     return (double)largest/(Math.pow(2, bitDepth-1)); 
    } 
} 
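The mono-to-stereo interleaving used by convertMonoToStereo above can be checked in isolation. This standalone sketch (a hypothetical mirror of the 16-bit branch, without the class's format field) duplicates each 2-byte sample into left and right channels:

```java
import java.util.Arrays;

public class MonoToStereoDemo {

    /** Duplicates each 16-bit mono sample into left and right channels. */
    static byte[] monoToStereo16(byte[] mono, int bytesRead) {
        byte[] stereo = new byte[bytesRead * 2];
        for (int ii = 0; ii < bytesRead - 1; ii += 2) {
            stereo[ii * 2]     = mono[ii];     // left channel, low byte
            stereo[ii * 2 + 1] = mono[ii + 1]; // left channel, high byte
            stereo[ii * 2 + 2] = mono[ii];     // right channel, low byte
            stereo[ii * 2 + 3] = mono[ii + 1]; // right channel, high byte
        }
        return stereo;
    }

    public static void main(String[] args) {
        byte[] mono = {1, 2, 3, 4}; // two 16-bit samples
        System.out.println(Arrays.toString(monoToStereo16(mono, mono.length)));
        // [1, 2, 1, 2, 3, 4, 3, 4]
    }
}
```

Each 16-bit sample (a low/high byte pair) appears twice in the output, doubling the byte count, which is why the play loop in start() also doubles bytesRead after the conversion.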

You don't have to hope that enough memory exists in the VM to load a huge file. Buffers were invented for a reason... –


* "희망하지 않는 것이 현명하지 않습니다."* 그래서 나는 그렇지 않습니다. 예방 조치를 취하고 비상시에 지워질 수있는 byte []의 예비 버퍼를 선언합니다. OOME에서 복구하는 방법은 알고 있으며 유스 케이스에서는 허용됩니다. –


I didn't mean to step on any toes. Having noticed your reputation and profile, I'm sure BigClip is safe :) –


Google got me here: http://docs.oracle.com/javase/tutorial/sound/sampled-overview.html
After poring over the first three sections, I was able to put this together:

import javax.sound.sampled.*; 
import java.io.*; 

    public class Playme { 

     Playme(String filename){ 

      int total, totalToRead, numBytesRead, numBytesToRead; 
      byte[] buffer; 
      boolean   stopped; 
      AudioFormat  wav; 
      TargetDataLine line; 
      SourceDataLine lineIn; 
      DataLine.Info info; 
      File   file; 
      FileInputStream fis; 

      //AudioFormat(float sampleRate, int sampleSizeInBits, 
      //int channels, boolean signed, boolean bigEndian) 
      wav = new AudioFormat(44100, 16, 2, true, false); 
      info = new DataLine.Info(SourceDataLine.class, wav); 


      buffer = new byte[1024*333]; 
      numBytesToRead = 1024*333; 
      total=0; 
      stopped = false; 

      if (!AudioSystem.isLineSupported(info)) { 
       System.out.print("no support for " + wav.toString()); 
      } 
      try { 
       // Obtain and open the line. 
       lineIn = (SourceDataLine) AudioSystem.getLine(info); 
       lineIn.open(wav); 
       lineIn.start(); 
       fis = new FileInputStream(file = new File(filename)); 
       totalToRead = fis.available(); 



       while (total < totalToRead && !stopped){ 
        numBytesRead = fis.read(buffer, 0, numBytesToRead); 
        if (numBytesRead == -1) break; 
        total += numBytesRead; 
        lineIn.write(buffer, 0, numBytesRead); 
       } 

      } catch (LineUnavailableException ex) { 
       ex.printStackTrace(); 
      } catch (FileNotFoundException nofile) { 
       nofile.printStackTrace(); 
      } catch (IOException io) { 
       io.printStackTrace(); 
      } 
    } 

      public static void main(String[] argv) { 
       Playme mb_745 = new Playme(argv[0]); 
       //Playme mb_745 = new Playme("/R/tmp/tmp/audiodump.wav"); 

      } 
    } 

Note that there is probably a bug in

numBytesToRead = 1024*333; 

since the javadoc of SourceDataLine.write says:

The number of bytes to write must represent 
an integral number of sample frames, such that: 
[ bytes written ] % [frame size in bytes ] == 0 

The information that went into new AudioFormat(44100, 16, 2, true, false) came from:

$ file /R/tmp/tmp/audiodump.wav 
/R/tmp/tmp/audiodump.wav: RIFF (little-endian) data, 
WAVE audio, Microsoft PCM, 16 bit, stereo 44100 Hz 

So now I'm listening to this huge 745 MB wave by running:

javac Playme.java && java Playme /R/tmp/tmp/audiodump.wav 

Hope this is useful, and good luck!
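The frame-alignment caveat above can be handled by rounding a read size down to a whole number of frames. A minimal sketch (hypothetical helper, assuming the 4-byte frames of 16-bit stereo):

```java
public class FrameAlign {

    /** Rounds a byte count down to a whole number of sample frames,
        satisfying [bytes written] % [frame size in bytes] == 0. */
    static int alignToFrames(int byteCount, int frameSizeInBytes) {
        return (byteCount / frameSizeInBytes) * frameSizeInBytes;
    }

    public static void main(String[] args) {
        int frameSize = 4; // 16-bit stereo: 2 bytes per sample * 2 channels
        System.out.println(alignToFrames(1024 * 333, frameSize)); // 340992
        System.out.println(alignToFrames(1001, frameSize));       // 1000
    }
}
```

As it happens, 1024*333 is already a multiple of 4, so the buffer size is legal for this particular 16-bit stereo format; the rounding only matters for formats with other frame sizes.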


When I first read the OP's question, I looked at 'Clip' and concluded that a Clip was actually what was needed. I'm no longer so sure. If it isn't, this is definitely the way to go. Perhaps the OP can clarify why they think they need to start with a Clip. If it's purely for looping an entire soundtrack, it shouldn't be hard to make this example do that, and it would have far less memory overhead than using a 'BigClip'. –


As you said, the reason I used a Clip was the simplicity of looping, and playing small sounds with it is very easy; but if there is a better way I will try it. – user1150769


Fixed a couple of bugs in the proposed BigClip code (a bug in converting frames to microseconds and vice versa). The private convertFrameToM..seconds methods were changed back from 'milli' to 'micro'. Now getMicrosecondPosition(), setMicrosecondPosition() and getMicrosecondLength() work correctly.

....................... 
    /** Converts a frame count to a duration in microseconds. */ 
    private long convertFramesToMicrosecond(int frames) { 
     return (long)(frames/dataLine.getFormat().getSampleRate() * 1000000); 
    } 

    /** Converts a duration in microseconds to a frame count. */ 
    private int convertMicrosecondToFrames(long microsecond) { 
     return (int) (microsecond/1000000.0 * dataLine.getFormat().getSampleRate()); 
    } 
....................... 
    @Override 
    public void setMicrosecondPosition(long microsecond) { 
     framePosition = convertMicrosecondToFrames(microsecond); 
    } 

    @Override 
    public long getMicrosecondPosition() { 
     return convertFramesToMicrosecond(getFramePosition()); 
    } 

    @Override 
    public long getMicrosecondLength() { 
     return convertFramesToMicrosecond(getFrameLength()); 
    } 
....................... 
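The corrected conversions can be sanity-checked in isolation. This standalone sketch (hypothetical names, hard-coding a 44100 Hz rate instead of querying the data line) verifies the round trip:

```java
public class MicrosecondConversionCheck {

    static final float SAMPLE_RATE = 44100f; // stands in for dataLine.getFormat().getSampleRate()

    static long framesToMicroseconds(int frames) {
        return (long) (frames / SAMPLE_RATE * 1_000_000);
    }

    static int microsecondsToFrames(long microseconds) {
        return (int) (microseconds / 1_000_000.0 * SAMPLE_RATE);
    }

    public static void main(String[] args) {
        // One second of audio is exactly 44100 frames at this rate.
        System.out.println(framesToMicroseconds(44100));     // 1000000
        System.out.println(microsecondsToFrames(1_000_000)); // 44100
    }
}
```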