
Sony Vegas 5 Manual

APPENDIX A: TROUBLESHOOTING

    Why can’t I work with footage captured using an MJPEG card?
    Vegas software requires that you have the MJPEG codec (for the MJPEG card used to capture the video) 
    installed locally on your workstation. Check to make sure that the appropriate MJPEG codec is installed on 
    your PC. 
    						
    Trouble-free video: software solutions
    There are literally dozens of possible configurations of hardware for editing video on a PC. While it is 
    impossible to go into detail for each and every system, the following explains some of the concepts behind 
    the various settings in Vegas software. Editing and playing back full-frame, 30 fps video is one of the most 
    demanding activities for any computer. The hardware you use is an important part of the equation, but there 
    are a number of things you can do to optimize your PC for video. The following list is arranged from the most 
    to the least important.
    Close all other applications. When capturing video or playing it back, it is critical that no other 
    applications interrupt this process. Close any applications that are not vital. This includes screen savers, 
    task schedulers, and even virus-detection software. You can ensure that you have closed all unnecessary 
applications by pressing Ctrl+Alt+Delete, selecting the individual applications, and clicking the End Task 
button to close them. Certain processes are required and should not (or cannot) be terminated (for example, 
Explorer).
Check your virtual memory. The Windows operating system uses virtual memory when RAM is low. This is a 
method by which Windows uses the hard disk to create more memory, sometimes called a paging file. If 
Windows tries to write to the paging file during playback or capture, it can interrupt the video software 
and cause problems. Make sure that virtual memory is on a different disk drive than the one from which 
you are capturing or playing your video. If you have enough space, use C: for virtual memory and use a 
physically distinct drive for capturing and playing back video.
    Make sure you have the latest drivers for your video card and capture card and the latest updates and 
    patches to all relevant software. One caveat to this is that you shouldn’t try to fix a program that is 
    working correctly. Many times patches and updates fix relatively minor bugs that only affect a small 
    number of users. If you are not experiencing any problems, it is probably best not to upgrade unless the 
    manufacturer recommends it.
    Uncompressed video may be high quality, but it results in very large files with very high data rates. 
    Selecting a more appropriate compression scheme (codec) will definitely improve the situation. If you are 
    creating movies that need maximum quality, however, this may not be an option.
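As a rough illustration of those data rates, here is a small Python sketch. The frame size, frame rate, and bytes-per-pixel figures are illustrative assumptions for NTSC-sized 4:2:2 video, not values taken from this manual; the DV rate is the ~3.6 MB/sec figure quoted later in this appendix.

    # Rough data-rate arithmetic for NTSC-sized video (illustrative values only).
    width, height, fps = 720, 480, 30      # assumed NTSC-sized frame and nominal rate
    bytes_per_pixel = 2                    # assumed uncompressed 4:2:2 YUV

    uncompressed_rate = width * height * bytes_per_pixel * fps   # bytes per second
    dv_rate = 3.6 * 1024 * 1024            # the ~3.6 MB/sec DV rate mentioned later

    print(f"Uncompressed: {uncompressed_rate / 1024 / 1024:.1f} MB/sec")   # ~19.8 MB/sec
    print(f"DV:           {dv_rate / 1024 / 1024:.1f} MB/sec")
    print(f"One minute uncompressed: {uncompressed_rate * 60 / 1024 ** 3:.2f} GB")  # ~1.16 GB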
    Trouble-free video: hardware solutions
    Even with a fast computer, video is still a hardware challenge. On the other hand, it is definitely possible to 
    properly configure a 400 MHz Pentium to work with large video files. There are three parts of your PC that 
    are important and the speed of your CPU is not necessarily the most critical. The following list is arranged 
    from the most to the least important.
    Video subsystem
    Many graphics cards (video boards, primary display cards) on a PC cannot handle full-screen, full-frame rate 
    video. While this leads to jerky, hesitating playback, it may not actually be a serious problem. A common 
    video configuration is to have a separate video capture card and a primary display card. In this case, the 
    playback using the primary display on the computer may be jerky, but when you finally output the video to 
    tape and view it on your television monitor there may not be any problems. If you are not creating movies to 
    go back to the television or VCR and you are experiencing stuttering playback, you should consider using a 
smaller frame size (320x240) and a lower frame rate (15 fps).
    Hard disk
    The second most common problem is slow hard disks. Until recently, fast, expensive SCSI AV hard disks 
    were required to properly capture and play back video on a PC. Slow hard disk problems also manifest 
    themselves with jerky video playback, although the stutters are less frequent and of longer duration than if 
    the video subsystem is the problem. Slower hard disks (e.g., 5400 RPM IDE) can cause an occasional 
    dropped frame. DV enthusiasts have fewer problems due to the low data rate (~3.6 MB/sec.) of that format. 
    The following section outlines some recommendations arranged in order of importance.
    Buy a dedicated video drive. This is easily the most important piece of hardware advice. A dedicated, 
    physically distinct hard drive is almost a requirement for any type of serious video work. This means that 
    you have one primary C:\ drive (or wherever your operating system is installed) and a separate drive for 
    video. You can use your dedicated drive for other purposes, especially storage, but it is a good idea not to 
    run any applications from it and to keep Windows virtual memory off of it. It is very important that the 
    drive only be used for video when playing and capturing, and that other programs (including Windows) 
    are not trying to access it. Since video files are so large, a dedicated drive is not an unreasonable item even 
    if digital video is just a hobby. You can never have too much hard disk space.
Buy a faster hard drive. Older 5400 RPM hard drives may not be fast enough to capture and play back 
video for any length of time, while newer 7200 RPM drives are almost always adequate. Be careful: when 
manufacturers quote the speed of a drive, they are usually talking about burst transfer rates. A drive that 
can transfer data in bursts at 80 MB/sec is worthless for video if it cannot sustain a much slower rate of 
8 MB/sec for thirty minutes (or more) without dropping a frame (see the rough calculation at the end of 
this list). Look to other computer video enthusiasts for additional advice. Again, RPM is a very good 
indicator: 7200 RPM IDE drives are usually newer (circa 1998 or later), and older 7200 RPM drives are 
usually SCSI, which are higher-quality drives to begin with.
    IDE vs. SCSI. While this was a big issue just a few years ago, it has fortunately faded in importance. Hard 
    drives can be hooked up to your computer in a number of ways, with the two largest divisions being IDE 
    and SCSI. This interface simply determines how much data can be transferred to and from the drive in a 
    second. The interface almost always far outstrips the performance of even the best hard disks and even the 
    slower interfaces exceed the transfer requirements of video data. SCSI hard disks are usually more 
    expensive and require a special controller, and while SCSI-2 promises 80MB/sec transfer rates, this is 
    overkill for most people. Newer IDE hard disks with designations of EIDE, DMA, Ultra-DMA, ATA-33, 
    and ATA-66 (and newer drives that came out after this writing) can all handle most sustained video 
    requirements.
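To put the sustained-rate advice above into numbers, here is a minimal sketch. The 3.6 MB/sec DV rate and the 8 MB/sec sustained rate come from the text above; everything else is illustrative.

    # Does a drive's *sustained* rate cover a DV capture, and how large is the file?
    dv_rate_mb_s = 3.6        # DV data rate quoted earlier (MB/sec)
    sustained_mb_s = 8.0      # sustained rate used as an example in the text (MB/sec)
    minutes = 30

    keeps_up = sustained_mb_s >= dv_rate_mb_s
    file_size_gb = dv_rate_mb_s * 60 * minutes / 1024

    print(f"Sustained {sustained_mb_s} MB/sec covers DV ({dv_rate_mb_s} MB/sec): {keeps_up}")
    print(f"A {minutes}-minute DV capture needs roughly {file_size_gb:.1f} GB")   # ~6.3 GB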
    CPU and RAM (memory)
    While the CPU and the RAM are probably the most important overall aspects of a PC’s speed and 
    performance, these factors are only third on the list for video. For the most part, these critical components do 
    not affect the capture or playback of video. This does not mean that a faster CPU or more RAM will not 
    help, because bigger and faster is always better: CPU and RAM definitely impact rendering speeds. Creating 
    a final AVI file, especially in a movie project that uses a lot of effects and transitions, can take a long time. A 
    thirty-minute movie could easily take six or more hours to render, depending on the format and effects used. 
    CPU speed is also important for more advanced compression codecs, such as MPEG and newer streaming 
    formats. 
    						
    Audio proxy files (.sfap0)
    Working with certain types of media files with particular audio compression schemes can be inefficient and 
    slow. To compensate for this, Vegas software creates audio proxy files for formats that are known to 
    dramatically impact performance. There are two cases where this occurs.
    Multimedia video files often contain both video and audio information. In certain formats, these two streams 
    can be packed together in such a way as to make editing slow and inefficient. Vegas software therefore takes 
    the audio stream from these files (e.g., type-1 DV, QuickTime™ 4) and saves it to a separate and more 
    manageable audio proxy file.
QuickTime 4 audio-only files can also be compressed in a way that makes editing slower. Vegas software 
uses audio proxy files in this situation as well. While audio proxy files may be large (because they are 
uncompressed), the performance increase is significant.
The file is saved as a proprietary .sfap0 file with the same name as the original media file, and it has the 
same characteristics as the original audio stream. So movie.avi yields a movie.avi.sfap0 audio proxy. Additional
    audio streams in the same file are saved as movie.avi.sfap1, movie.avi.sfap2, etc. This is a one-time process 
    that greatly speeds up editing. The conversion happens automatically and does not result in a loss of quality 
    or synchronization. The original source file remains unchanged (the entire process is nondestructive). Audio 
    proxy files can be safely deleted at any time since the application recreates these files as needed.
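The naming convention can be sketched in a few lines of Python. The helper function below is hypothetical and is not part of Vegas software; it only illustrates the movie.avi.sfapN pattern described above.

    # Hypothetical helper illustrating the movie.avi -> movie.avi.sfapN naming scheme.
    def proxy_names(media_file, audio_stream_count):
        """Return the audio proxy file names the convention above would produce."""
        return [f"{media_file}.sfap{i}" for i in range(audio_stream_count)]

    print(proxy_names("movie.avi", 2))   # ['movie.avi.sfap0', 'movie.avi.sfap1']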
    Note: Vegas software saves audio proxy files to the same 
    folder as the source media. If the source media folder is read-
    only (e.g., CD-ROM), the files are saved to a temporary 
    directory.
    Interlacing and field order
    Field order in interlaced video is an important parameter that can severely impact the quality of video on a 
    television monitor. While the concept is easy enough to understand, the lack of standards in both 
    technology and terminology clouds the issue.
    The path of the electron gun across the screen is fundamentally different between television monitors and 
    computer monitors. Computer monitors scan every line in order, from left to right and top to bottom. This is 
    known as progressive scanning. On a standard television monitor, the electron gun scans every other line 
    from top to bottom, twice for every picture or frame. For example, the first scan from top to bottom might 
    scan all of the odd numbered lines first, then jump back to the top of the screen and, in the second scan, 
    draw all of the remaining even numbered lines, completing the frame. The two fields are said to be interlaced 
    together to form a single frame. 
    The illustration that follows shows how two frames in a video are actually composed of two fields each, for a 
    total of four fields. These fields can be referred to as field one (F1) and field two (F2). Obviously, it is critical 
    that these two fields are paired together to create a whole frame. What may not be so obvious is that the 
    actual order of these two fields is not particularly important. In other words, F1 could be scanned first and 
    then F2, or F2 could be scanned first and then F1. Both situations would create a perfectly valid, error-free 
    frame of video. While that may sound like good news, in reality this is the source of all of the problems 
    associated with field order. Since both methods are technically correct, both methods have been used. It is 
    important to use the correct order when rendering video files for your particular hardware (capture card). 
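The weaving of two fields into one frame can be sketched in a few lines of Python. This is a toy illustration with made-up scan-line labels, not anything Vegas software does internally.

    # Toy sketch: interleave two fields (lists of scan lines) into one full frame.
    def weave(first_field, second_field):
        """Alternate scan lines from two fields to build a complete frame."""
        frame = []
        for a, b in zip(first_field, second_field):
            frame.extend([a, b])
        return frame

    # Two frames, each split into an F1 and an F2 field (three lines per field).
    frame1_f1, frame1_f2 = ["1a", "1c", "1e"], ["1b", "1d", "1f"]
    frame2_f1, frame2_f2 = ["2a", "2c", "2e"], ["2b", "2d", "2f"]

    print(weave(frame1_f1, frame1_f2))   # all lines come from frame 1: a valid frame
    print(weave(frame2_f1, frame1_f2))   # fields from two different frames: the
                                         # mismatch described below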
    						
    The next illustration shows the effects of incorrectly interlacing a frame of video. In this case, F2 from frame 
    one is combined with F1 from frame two. Remember that there is nothing inherently right or wrong with a 
    field order of F2/F1; it just happens to be wrong in this case. At a minimum, this can create slightly blurry or 
    hazy video. In most situations, the video is jumpy or jittery and is unwatchable. Interlacing problems can be 
    especially noticeable when two adjacent frames are significantly different; for example, at a cut or in video 
    with fast moving action. It can also manifest itself in certain computer-generated special effects; for example, 
    in slow-motion sequences.
    The basic problem is that there is no standard correct field order. Some capture cards use F1/F2 and some use 
    F2/F1. If this were the extent of our troubles, we could check out our hardware manual, look up the correct 
    field order and that would be that. Unfortunately (if this information is even available) the terminology used 
    can be equally baffling. F1 may be called the odd, upper, or A field, or (more rarely) it may be called the 
    even, lower, or B field. Add into the mix the fact that the first scan line might be numbered 0 or 1 (which 
    changes whether the field is considered odd or even), and that cropping may change which line is ultimately 
    scanned first, and you can see that this is not a very clear-cut problem. The remainder of this section deals 
    with how to sort this out in Vegas software. Fortunately, you only have to determine the correct settings 
    once for any particular hardware setup.
    Identifying problems
    Vegas software refers to the two fields as upper field first and lower field first. These are probably the most 
common terms used to distinguish the two fields, and you may find a page in your hardware's manual that 
    says something like “Use a field order of lower first.” In many cases (but not all or even most), 
    Upper=Odd=A and Lower=Even=B. 
[Illustrations: the first shows two frames, each composed of fields F1 and F2 (four fields in total); the second 
shows a mis-interlaced television frame that combines F2 of frame 1 with F1 of frame 2.]
    						
In the application, you can select the field order of a project by choosing Properties from the File menu and 
clicking the Video tab. The pre-configured templates should work for almost everyone (e.g., if you are editing 
and outputting DV video in the US, select the NTSC DV template). If you have problems, you can manually 
select a different field order on the Video tab. You can also override the project settings and set the field 
order when you render a video file. From the File menu, choose Render As, click the Custom button, and 
choose an option from the Field order drop-down list on the Video tab. You can also set field order at the 
level of the media file or event. Right-click a media file in the Media Pool or an event on the timeline and 
choose Properties. The Field order drop-down list appears on the Media tab.
    Interlacing problems only manifest themselves on television monitors. Video that is going to be played back 
    on a computer does not need to be interlaced, and you can select 
    None (progressive scan) for the field order. 
    Rendered video must be displayed on a television monitor to identify any problems. The only way to see 
    interlacing problems is to record (print) a rendered video file out to tape and play back the tape on a 
    television. Problems are most apparent in video that has a lot of motion or that has been modified in some 
    way; for example, a slow-motion effect. (Some codecs force the correct field order during a render, making it 
    difficult or impossible to create video with the wrong field order.)
    Solving interlacing problems in Vegas software
    If your hardware’s documentation does not contain any information about the proper field order, you must 
    determine this information for yourself. It is not a difficult process, and involves rendering one video file 
    with an upper first field order and another with a lower first field order. Source material that dramatically 
    and clearly demonstrates the improperly interlaced video is important: use a media file with a lot of motion 
    in it and then slow the event down with a velocity envelope or by time-stretching the event.
    Timecode
    Timecode is a method of labelling frames with a unique and searchable identifier. It is primarily important 
    for synchronizing video (in frames per second) with time in the real world and, in the case of Vegas software, 
    with other media in a project. 
    Changing the timecode used to measure a video file does not alter the contents of the file. For example, no 
    frames are ever dropped or removed when using SMPTE 29.97 drop frame timecode. Instead, specific frame 
    numbers are periodically dropped to compensate for differences between timecode and time in the real 
    world. Confusion between using drop versus non-drop timecode can cause synchronization problems 
between video and audio. For very short periods of time, the error would be unnoticeable. After about half 
    an hour, you might notice that mouths and words do not quite match in shots of people speaking. Longer 
    stretches of time show larger discrepancies in synchronization.
    Changing the timecode displayed on an event is not equivalent to converting a video to another format. You 
    cannot convert NTSC video at 29.97 fps to PAL video at 25 fps by simply changing the timecode. To 
    convert NTSC video to PAL video in Vegas software, you need to re-render the video in the new format. In 
    this situation, the conversion process necessarily results in some frames of video actually being removed from 
    the original sequence.
    SMPTE timecode types
    The following are descriptions of each of the Society of Motion Picture and Television Engineers (SMPTE) 
    timecode types.
    SMPTE 25 EBU (25 fps, Video)
    SMPTE 25 EBU timecode runs at 25 fps, and matches the frame rate used by European Broadcasting Union 
    (EBU) television systems.
    Use SMPTE 25 EBU format for PAL DV/D1 projects. 
    						
    SMPTE Drop Frame (29.97 fps, Video)
    SMPTE Drop Frame timecode runs at 29.97 fps, and matches the frame rate used by NTSC television 
    systems (North America, Japan). 
    Use SMPTE Drop Frame format for NTSC DV/D1 projects.
    Both SMPTE Drop and SMPTE Non-Drop run at 29.97 fps. In both formats, the actual frames are not 
    discarded, but they are numbered differently. SMPTE Drop removes certain frame numbers from the 
    counting system to keep the SMPTE clock from drifting from real time. The time is adjusted forward by two 
    frames on every minute boundary except 0, 10, 20, 30, 40, and 50. For example, when SMPTE Drop time 
    increments from 00:00:59.29, the next value is 00:01:00.02. 
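That counting rule can be written as a short conversion routine. The following is a sketch of the standard drop-frame arithmetic (shown here with a semicolon before the frames field, a common way of writing drop-frame values); it is not code taken from Vegas software.

    # Convert a running frame index (29.97 fps material) to SMPTE drop-frame notation.
    def drop_frame_timecode(frame_count):
        """Skip two frame numbers each minute except minutes 0, 10, 20, 30, 40, 50."""
        fps = 30                                   # nominal frame numbers per second
        frames_per_min = 60 * fps - 2              # 1798 numbers in a minute that drops two
        frames_per_10min = 10 * 60 * fps - 9 * 2   # 17982: only 9 of every 10 minutes drop

        tens, rem = divmod(frame_count, frames_per_10min)
        if rem < 2:
            adjusted = frame_count + 18 * tens
        else:
            adjusted = frame_count + 18 * tens + 2 * ((rem - 2) // frames_per_min)

        hh, rest = divmod(adjusted, 3600 * fps)
        mm, rest = divmod(rest, 60 * fps)
        ss, ff = divmod(rest, fps)
        return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

    print(drop_frame_timecode(1799))   # 00:00:59;29
    print(drop_frame_timecode(1800))   # 00:01:00;02 -- numbers ;00 and ;01 were skipped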
    SMPTE Non-Drop Frame (29.97 fps, Video)
    SMPTE Non-Drop Frame timecode runs at a rate of 29.97 fps. This leads to a discrepancy between real time 
    and the SMPTE time, because there is no compensation in the counting system as there is in SMPTE Drop 
    Frame.
    Use SMPTE Non-Drop format for NTSC D1 projects that are recorded on master tapes striped with Non-
    Drop timecode.
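A quick back-of-the-envelope check of that discrepancy (plain arithmetic, not a Vegas feature):

    # How far does 29.97 fps non-drop timecode drift from the clock on the wall?
    fps_exact = 30000 / 1001                    # "29.97" fps, written exactly
    frames_per_timecode_hour = 60 * 60 * 30     # non-drop counts a full 30 each second

    real_seconds = frames_per_timecode_hour / fps_exact
    print(f"Timecode reads 01:00:00:00 after {real_seconds:.1f} real seconds")   # 3603.6
    print(f"Drift: about {real_seconds - 3600:.1f} seconds per hour")            # ~3.6 s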
    SMPTE 30 (30 fps, Audio)
    SMPTE 30 is an audio-only format and runs at exactly 30 fps. SMPTE 30 is commonly used when 
    synchronizing audio applications such as multitrack recorders or MIDI sequencers. This format should not be 
    used when working with video.
    SMPTE Film Sync (24 fps)
    The SMPTE Film Sync time format runs at 24 fps (frames per second). This frame rate matches the standard 
crystal-sync 16/35 mm film rate of 24 fps.
    Timecode in Vegas software
Video timecode crops up fairly frequently in Vegas software. Because the application is a multimedia 
production tool, time in it can be measured in real-world time (hours, minutes, seconds), in video timecode 
(involving frames of video), or in musical time (measures and beats).
    Ruler format and timecode
    The ruler in Vegas software can be set to measure time in any way that is convenient. This setting does not 
    change how the final file is rendered, but controls the grid lines and how snapping behaves. Right-click the 
    ruler and choose a time format from the shortcut list. For more information, see Changing the ruler format on 
    page 285.
    Preferences dialog timecode settings
From the Options menu, choose Preferences and click the Video tab to adjust the Show source frame numbers 
on event thumbnails as drop-down list. These settings take precedence over those found in the source media 
Properties dialog (see the next topic) and are displayed on events inserted into the timeline. None means that 
no numbers are displayed on events, Frame Numbers marks frames in the media file starting with 0, Time 
displays the time in seconds, and Timecode allows the source media’s timecode to be detected or selected.
    						
    Source media timecode format
Right-click an event, choose Properties, and click the Media tab to view these properties. By default, Use 
timecode in file is selected.
Note: You can override these settings by choosing different settings on the Video tab of the Preferences 
dialog. Select Timecode from the Source frame numbering list to allow event-level specification.
    Render media file format
    The timecode of a final rendered media file is determined by the specified format. The frame rate of the 
    project ultimately determines the timecode and is often constrained by the type of media file being rendered 
    or the codec being used for compression. For example, NTSC DV is typically limited to a frame rate of 
    29.97 fps and uses SMPTE drop frame timecode.
    Time formats in Vegas software
    A variety of time formats are provided in the application. For more information, see Changing the ruler format 
    on page 285.
    Troubleshooting DV hardware issues
    Vegas software is designed to integrate seamlessly with OHCI compliant IEEE-1394 DV video capture 
    hardware and DV camcorders. While most people never have any problems, the vast number of hardware 
    configuration possibilities makes this a potentially complex issue. There are a number of resources at the 
    Sony Pictures Digital Media Software and Services Web site that may be able to assist you.
    More detailed information is available at: 
    http://mediasoftware.sonypictures.com/Support/Productinfo/OHCI.asp
    You can also visit the Vegas Updates Web page to access a troubleshooting document for OHCI-compliant 
    devices. From the Sony Pictures Digital Media Software home page, go to the Download page and click 
    Updates. Click the Vegas Update link to access the update page. 
    						
APPENDIX B: GLOSSARY

    A-Law
    A companded compression algorithm for voice signals defined by the Geneva Recommendations (G.711). 
    The G.711 recommendation defines A-Law as a method of encoding 16-bit PCM signals into a nonlinear 
8-bit format. The algorithm is commonly used in European telecommunications. A-Law is very similar 
to µ-Law; however, each uses a slightly different coder and decoder.
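As a rough illustration, the idealized A-law companding curve (A = 87.6) can be written in a few lines. This sketch uses the continuous formula rather than the segmented 8-bit quantizer tables that G.711 equipment actually implements.

    import math

    def a_law_compress(x, A=87.6):
        """Idealized A-law companding of a sample x in the range [-1.0, 1.0]."""
        ax = abs(x)
        if ax < 1.0 / A:
            y = A * ax / (1.0 + math.log(A))
        else:
            y = (1.0 + math.log(A * ax)) / (1.0 + math.log(A))
        return math.copysign(y, x)

    # Quiet samples keep proportionally more resolution than loud ones.
    print(a_law_compress(0.01), a_law_compress(0.5), a_law_compress(1.0))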
    Adaptive Delta Pulse Code Modulation (ADPCM)
    A method of compressing audio data. Although the theory for compression using ADPCM is standard, there 
    are many different algorithms employed. For example, the ADPCM algorithm from Microsoft® is not 
    compatible with the International Multimedia Association’s (IMA) approved ADPCM.
    Aliasing
    A type of distortion that occurs when digitally recording high frequencies with a low sample rate. For 
    example, in a motion picture, when a car’s wheels appear to slowly spin backward while the car is quickly 
    moving forward, you are seeing the effects of aliasing. Similarly, when you try to record a frequency greater 
    than one-half of the sampling rate (the Nyquist Frequency), instead of hearing a high pitch, you may hear 
    alias frequencies in the low end of the spectrum.
To prevent aliasing, an anti-aliasing filter is used to remove high frequencies before recording. Once the 
    sound has been recorded, aliasing distortion is impossible to remove without also removing other frequencies 
    from the sound. This same anti-aliasing filter must be applied when resampling to a lower sample rate.
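A small numerical illustration of an alias frequency (the 8000 Hz sample rate and 7000 Hz tone below are made-up values): a tone above the Nyquist frequency produces the same sample values as a low-frequency tone.

    import math

    sample_rate = 8000   # Hz; the Nyquist frequency is 4000 Hz
    tone = 7000          # Hz, above the Nyquist frequency
    alias = abs(tone - sample_rate)   # 1000 Hz: where the tone appears after sampling

    # The samples of the 7000 Hz tone match a 1000 Hz tone (with inverted phase).
    for n in range(4):
        t = n / sample_rate
        print(round(math.sin(2 * math.pi * tone * t), 3),
              round(math.sin(2 * math.pi * alias * t), 3))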
    Amplitude Modulation (AM)
    A process whereby the amplitude (loudness) of a sound is varied over time. When varied slowly, a tremolo 
    effect occurs. If the frequency of modulation is high, many side frequencies are created which can strongly 
    alter the timbre of a sound.
    Analog
    When discussing audio, this term refers to a method of reproducing a sound wave with voltage fluctuations 
    that are analogous to the pressure fluctuations of the sound wave. This is different from digital recording in 
    that these fluctuations are infinitely varying rather than discrete changes at sample time (see Quantization).
    ASIO
    ASIO (Audio Stream In/Out)™ is a low-latency driver model developed by Steinberg Media 
    Technologies AG.
    ASIO audio drivers are only supported in the full version of Vegas® software.
    Attack
    The attack of a sound is the initial portion of the sound. Percussive sounds (drums, piano, guitar plucks) are 
    said to have a fast attack. This means that the sound reaches its maximum amplitude in a very short time. 
    Sounds that slowly swell up in volume (soft strings and wind sounds) are said to have a slow attack.
    Attenuation
    A decrease in the level of an audio signal.
    Audio Compression Manager (ACM)
    The Audio Compression Manager from Microsoft® is a standard interface for audio compression and signal 
    processing for Windows. The ACM can be used by Microsoft® Windows® programs to compress and 
    decompress WAV files.
AVI
A file format for digital video. Vegas software allows you to open, edit, and create new AVI files.
    Bandwidth
In the built-in EQ plug-in, each frequency band has a width associated with it that determines the range 
of frequencies that are affected by the EQ. An EQ band with a wide bandwidth affects a wider range of 
frequencies than one with a narrow bandwidth.
Bandwidth can also refer to the amount of data that can be transferred via a connection, such as a network 
or modem. For example, streaming media must be compressed due to the limited bandwidth of most Internet 
connections.
    Beats Per Measure
    In music theory, the time signature of a piece of music contains two pieces of information: the number of 
    beats in each measure of music, and which note value gets one beat. This notion is used to determine the 
    number of ticks to put on the ruler above the track view, and to determine the spacing when the ruler 
    displays in measures and beats format.
    Beats Per Minute (BPM)
    In music theory, the tempo of a piece of music can be written as a number of beats in one minute. If the 
tempo is 60 BPM, a single beat occurs once every second. Lower BPM values mean a slower tempo, and vice versa.
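The 60 BPM example works out as simple arithmetic; the helper below is only an illustration.

    def seconds_per_beat(bpm):
        """How long one beat lasts at a given tempo."""
        return 60.0 / bpm

    print(seconds_per_beat(60))    # 1.0 second, as in the example above
    print(seconds_per_beat(120))   # 0.5 second: a higher BPM means a faster tempo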
    Bit
    A bit is the most elementary unit in digital systems. Its value can only be 1 or 0, corresponding to a voltage 
    in an electronic circuit. Bits are used to represent values in the binary numbering system. As an example, the 
    8-bit binary number 10011010 represents the unsigned value of 154 in the decimal system. In digital 
    sampling (specifically the PCM format), a binary number is used to store individual sound levels, called 
    samples.
    Bit Depth
    The number of bits used to represent a single sample. Vegas software uses either 8, 16, or 24-bit samples. 
    Higher values increase the quality of the playback and any recordings that you make. While 8-bit samples 
    take up less memory (and hard disk space), they are inherently noisier than 16 or 24-bit samples.
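One common rule of thumb, not stated in this manual, puts numbers on that difference: each bit of sample depth adds roughly 6 dB of dynamic range for ideal PCM quantization.

    def approx_dynamic_range_db(bits):
        """Rule-of-thumb dynamic range of ideal PCM quantization (full-scale sine)."""
        return 6.02 * bits + 1.76

    for bits in (8, 16, 24):
        print(f"{bits}-bit: about {approx_dynamic_range_db(bits):.0f} dB")  # ~50, ~98, ~146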
    Bus
    A virtual pathway where signals from tracks and effects are mixed. A bus’s output can be a physical audio 
    device in the computer from which the signal is heard.
    Byte
    Refers to a set of 8 bits. An 8-bit sample requires one byte of memory to store, while a 16-bit sample takes 
    two bytes of memory to store. 
    						