Search Results

Search found 67075 results on 2683 pages for 'data model'.

  • SQL Server 2005 standard filegroups / files for performance on SAN

    - by Blootac
    I submitted this to Stack Overflow (here) but realised it should really be on Server Fault, so apologies for the duplicate posting. I've just been on a SQL Server course where we discussed the usage scenarios of multiple filegroups and files over local RAID and local disks, but we didn't touch on SAN scenarios, so my question is as follows. I currently have a 250 GB database running on SQL Server 2005 where some tables have a huge number of writes and others are fairly static. The database and all objects reside in a single filegroup with a single data file, and the log file is on the same volume. My understanding is that separate data files should be spread across different disks to lessen disk contention, and that filegroups should be used for partitioning data. However, with a SAN you obviously don't really have the same disk contention issue that you do with a small RAID setup (or at least we don't at the moment), and Standard Edition doesn't support partitioning. So what should I do to improve parallelism? My reading of various Microsoft publications is that if I increase the number of data files, separate threads can act on each file in parallel. Which leads to the question: how many files should I have? One per core? Should I put tables and indexes with high levels of activity in separate filegroups, each with the same number of data files as we have cores? Thank you
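
    For anyone who wants to experiment with the multiple-files approach, below is a minimal sketch of adding a second filegroup with one data file per core, driven from Python via pyodbc. The database name, filegroup name, paths, sizes, and file count are all hypothetical placeholders rather than a recommendation; the same ALTER DATABASE statements can of course be run directly in Management Studio.

    import pyodbc

    # ALTER DATABASE cannot run inside a user transaction, hence autocommit=True.
    cn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=myserver;DATABASE=master;Trusted_Connection=yes",
        autocommit=True,
    )
    cur = cn.cursor()

    # Hypothetical: a filegroup for the write-heavy tables, with one file
    # per core (4 here) as a starting point for parallelism testing.
    cur.execute("ALTER DATABASE MyDb ADD FILEGROUP HotTables")
    for i in range(1, 5):
        cur.execute(
            f"ALTER DATABASE MyDb ADD FILE "
            f"(NAME = HotTables{i}, FILENAME = 'E:\\SQLData\\HotTables{i}.ndf', "
            f"SIZE = 4096MB, FILEGROWTH = 1024MB) TO FILEGROUP HotTables"
        )
    cn.close()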

  • Windows Task Scheduler fails on EventData instruction

    - by Pete
    The Scheduled Task fails on the EventData instruction in this XML:

    <ValueQueries>
      <Value name="eventChannel">Event/System/Channel</Value>
      <Value name="eventRecordID">Event/System/EventRecordID</Value>
      <Value name="eventData">Event/EventData/Data</Value>
    </ValueQueries>

    The other two fields can be passed as arguments, and the eventData syntax matches other websites, so I don't know why it's failing. This is the Event Viewer XML:

    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="Aptify.ExceptionManagerPublishedException" />
        <EventID Qualifiers="0">0</EventID>
        <Level>2</Level>
        <Task>0</Task>
        <Keywords>0x80000000000000</Keywords>
        <TimeCreated SystemTime="2013-11-07T19:39:14.000000000Z" />
        <EventRecordID>97555</EventRecordID>
        <Channel>Application</Channel>
        <Computer>[Computer Name]</Computer>
        <Security />
      </System>
      <EventData>
        <Data>General Information
    *********************************************
    Additional Info:
    ExceptionManager.MachineName: [Computer Name]
    ExceptionManager.TimeStamp: 11/7/2013 12:39:14 PM
    ExceptionManager.FullName: AptifyExceptionManagement, Version=4.0.0.0, Culture=neutral, PublicKeyToken=[key]
    ExceptionManager.AppDomainName: Aptify Shell.exe
    ExceptionManager.ThreadIdentity:
    ExceptionManager.WindowsIdentity: ACA_DOMAIN\pbassett

    1) Exception Information
    *********************************************
    Exception Type: Aptify.Framework.BusinessLogic.GenericEntity.AptifyGenericEntityValidationException
    Entity: Tasks
    ErrorString: Task Type "Make Contact" is not active.
    MachineName: [machine]
    CreatedDateTime: 11/7/2013 12:39:14 PM
    AppDomainName: Aptify Shell.exe
    ThreadIdentityName:
    WindowsIdentityName: [identity]
    Severity: 0
    ErrorNumber: 0
    Message: Task Type "Make Contact" is not active.
    Data: System.Collections.ListDictionaryInternal
    TargetSite: Boolean Save(Boolean, System.String ByRef, System.String)
    HelpLink: NULL
    Source: AptifyGenericEntity

    StackTrace Information
    *********************************************
    at Aptify.Framework.BusinessLogic.GenericEntity.AptifyGenericEntity.Save(Boolean AllowGUI, String& ErrorString, String TransactionID)</Data>
      </EventData>
    </Event>
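
    One way to sanity-check the value queries outside Task Scheduler is to run the equivalent XPath against the event XML saved from Event Viewer. Note that the event lives in the http://schemas.microsoft.com/win/2004/08/events/event namespace, which a generic XML tool must declare even though Task Scheduler's own queries do not. A minimal Python sketch (the file name is hypothetical):

    import xml.etree.ElementTree as ET

    NS = {"ev": "http://schemas.microsoft.com/win/2004/08/events/event"}
    root = ET.parse("event.xml").getroot()  # event saved from Event Viewer

    # Mirrors the three ValueQueries from the task definition.
    for query in ("ev:System/ev:Channel",
                  "ev:System/ev:EventRecordID",
                  "ev:EventData/ev:Data"):
        node = root.find(query, NS)
        print(query, "->", None if node is None else node.text)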

  • Excel 2013: Is it possible to collapse rows only in a specific column?

    - by h7u9i
    In my spreadsheet, I'm trying to figure out a way to collapse rows in a specific column. Right now, if I do Data - Group - Group... - Rows, it collapses the entire row. I want to collapse rows only in a specific column. Example:

    |---------|----------|
    | hi      | + data1  |
    |---------|----------|
    | hello   | + data2  |
    |---------|----------|
    |         |          |
    |---------|----------|
    |         |          |

    And opening data1 would turn it into:

    |---------|----------|
    | hi      | - data1  |
    |---------|----------|
    | hello   | point1   |
    |---------|----------|
    |         | point2   |
    |---------|----------|
    |         | + data2  |
    |---------|----------|
    |         |          |
    |---------|----------|
    |         |          |

    Is this possible to do in Excel?
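
    For what it's worth, Excel's outline feature only groups whole rows or whole columns, so a per-column collapse isn't directly available. As a sketch of what the feature does support, here is whole-row grouping scripted with openpyxl; the file name and cell layout are hypothetical, and the row_dimensions.group call assumes a reasonably recent openpyxl.

    from openpyxl import Workbook

    wb = Workbook()
    ws = wb.active
    ws["A2"], ws["B2"] = "hi", "data1"
    ws["B3"], ws["B4"] = "point1", "point2"
    ws["A5"], ws["B5"] = "hello", "data2"

    # Group rows 3-4 under row 2; collapsing hides the whole rows,
    # not just the cells in column B.
    ws.row_dimensions.group(3, 4, hidden=True)
    wb.save("grouped.xlsx")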

  • KVM Slow performance on XP Guest

    - by Gregg Leventhal
    The system is very slow to do anything, even browsing a local folder, and the CPU sits at 100% frequently. The guest is XP 32-bit; the host is Scientific Linux 6.2 with libvirt 0.10. The guest XP OS shows the ACPI Multiprocessor HAL, and virtIO drivers for NIC and SCSI are installed. CPUInfo on the host:

    processor : 0
    vendor_id : GenuineIntel
    cpu family : 6
    model : 42
    model name : Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
    stepping : 7
    cpu MHz : 3200.000
    cache size : 8192 KB
    physical id : 0
    siblings : 8
    core id : 0
    cpu cores : 4
    apicid : 0
    initial apicid : 0
    fpu : yes
    fpu_exception : yes
    cpuid level : 13
    wp : yes
    flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx rdtscp lm constant_tsc arch_perfmon pebs bts rep_good xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 cx16 xtpr pdcm pcid sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx lahf_lm ida arat epb xsaveopt pln pts dts tpr_shadow vnmi flexpriority ept vpid
    bogomips : 6784.93
    clflush size : 64
    cache_alignment : 64
    address sizes : 36 bits physical, 48 bits virtual
    power management:

    The relevant part of the domain XML:

    <memory unit='KiB'>4194304</memory>
    <currentMemory unit='KiB'>4194304</currentMemory>
    <vcpu placement='static' cpuset='0'>1</vcpu>
    <os>
      <type arch='x86_64' machine='rhel6.3.0'>hvm</type>
      <boot dev='hd'/>
    </os>
    <features>
      <acpi/>
      <apic/>
      <pae/>
    </features>
    <cpu mode='custom' match='exact'>
      <model fallback='allow'>SandyBridge</model>
      <vendor>Intel</vendor>
      <feature policy='require' name='vme'/>
      <feature policy='require' name='tm2'/>
      <feature policy='require' name='est'/>
      <feature policy='require' name='vmx'/>
      <feature policy='require' name='osxsave'/>
      <feature policy='require' name='smx'/>
      <feature policy='require' name='ss'/>
      <feature policy='require' name='ds'/>
      <feature policy='require' name='tsc-deadline'/>
      <feature policy='require' name='dtes64'/>
      <feature policy='require' name='ht'/>
      <feature policy='require' name='pbe'/>
      <feature policy='require' name='tm'/>
      <feature policy='require' name='pdcm'/>
      <feature policy='require' name='ds_cpl'/>
      <feature policy='require' name='xtpr'/>
      <feature policy='require' name='acpi'/>
      <feature policy='require' name='monitor'/>
      <feature policy='force' name='sse'/>
      <feature policy='force' name='sse2'/>
      <feature policy='force' name='sse4.1'/>
      <feature policy='force' name='sse4.2'/>
      <feature policy='force' name='ssse3'/>
      <feature policy='force' name='x2apic'/>
    </cpu>
    <clock offset='localtime'>
      <timer name='rtc' tickpolicy='catchup'/>
    </clock>
    <on_poweroff>destroy</on_poweroff>
    <on_reboot>restart</on_reboot>
    <on_crash>restart</on_crash>
    <devices>
      <emulator>/usr/libexec/qemu-kvm</emulator>
      <disk type='file' device='disk'>
        <driver name='qemu' type='qcow2' cache='none'/>
        <source file='/var/lib/libvirt/images/Server-10-9-13.qcow2'/>
        <target dev='vda' bus='virtio'/>
        <alias name='virtio-disk0'/>
        <address type='pci' domain='0x0000' bus='0x00' slot='0x08' function='0x0'/>
      </disk>

  • SCCM 2012 R2 - OSD Task Sequence failure on physical computers

    - by Svanste
    I'm trying to deploy Windows 7 with SCCM 2012 R2 to physical desktops and laptops, but the task sequence keeps failing no matter what I try. When I try it on a VM it works fine; when I try it on a physical computer it fails. So I think it has something to do with drivers, but I have already tried both the "auto apply drivers" + WMI query for model method and the "apply driver package" + WMI query for model method. In the link below I added a zip file containing two other zip files: one is a captured log from a failed OSD on a desktop, the other is the export of my task sequence. Download zip-file with log and TS If anyone could resolve the issue, or share their own task sequence for such a task (pure SCCM 2012 (R2), no MDT), that would be great.

  • Formatted mac external hard drive loaded on pc

    - by kjokay
    I have an issue where an external hard drive that had been formatted on a Mac was loaded as a drive in Windows. Windows is obviously unable to read the data, and now the drive won't mount on the Mac. It appears that Windows overwrote something in the drive's metadata about which filesystems and partition types it contains. Mac Disk Utility is unable to repair the drive, and the partition shows up in the utility as FAT32. Using an AppleXsoft utility I am able to verify the data is still on the drive, but I'd rather not spend $100 to save these files (it's not my hard drive anyway). Is there a way I can use some UNIX commands to find out the partition information on the drive, back up the raw data on it, and then restore the data back onto the drive after re-formatting it?
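
    A sketch of the raw-backup idea (shell commands wrapped in Python for readability): dump the partition table as the OS currently sees it, then image the entire raw device before attempting any repair. The device node /dev/disk2 is hypothetical; confirm it with diskutil list first, since dd against the wrong device is destructive.

    import subprocess

    DEV = "disk2"  # hypothetical: confirm with `diskutil list` before running

    # Show the partition table as the OS currently reads it.
    subprocess.run(["sudo", "gpt", "-r", "show", f"/dev/{DEV}"], check=True)

    # Raw image of the whole device; rdisk is the unbuffered node, bs=1m for speed.
    subprocess.run(
        ["sudo", "dd", f"if=/dev/r{DEV}", "of=/Volumes/Backup/disk2.img", "bs=1m"],
        check=True,
    )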

  • Performance difference between MacBook Pro (2.8 GHz) vs Air (1.7 GHz)?

    - by jonathanconway
    I'm comparing these two Apple laptops:

    MacBook Pro (13", 2011 model):
    - 2.8GHz dual-core Intel Core i7 processor with 4MB shared L3 cache
    - 4GB (two 2GB SO-DIMMs) of 1333MHz DDR3 SDRAM
    - AMD Radeon HD 6770M graphics processor with 1GB of GDDR5 memory on 2.4GHz configuration

    MacBook Air (13", 2011 model):
    - 1.7GHz dual-core Intel Core i5 with 3MB shared L3 cache
    - 4GB of 1333MHz DDR3 onboard memory
    - Intel HD Graphics 3000 processor with 384MB of DDR3 SDRAM shared with main memory

    There's definitely a gap between them in terms of CPU speed and graphics, but what practical difference would this make on a day-to-day basis? On the one hand, I love the sleek, thin appearance of the Air. On the other hand, I don't want a machine that's going to be dog-slow when doing tasks such as running virtual machines, dual-booting to Windows, running multiple instances of Visual Studio, and maybe some light gaming. Is there going to be a major difference that makes the MacBook Pro a more attractive purchase?

  • Digital audio input on Macbook?

    - by Ken
    I have:

    - a MacBook (not Pro); I don't know the exact model, but it's a Core 2 Duo 2.0GHz and probably what Wikipedia calls the "Late 2006" or "Mid 2007" model
    - a DVD player, region-free, that has "Coax and TosLink optical digital audio outputs"

    I want to make an MP3 of the audio track of some DVDs (for learning a new language), and I can't use the MacBook's built-in DVD drive because it's a different region (ugh!). I'm sure I can connect the DVD player to the MacBook with an analog audio cable, but if possible I'd prefer to keep the signal digital. I'm not even positive my old MacBook has digital audio in, and if so, what I need to connect to it. (I've done plenty of home audio geeking, but always in analog!) Will a "Toslink cable" plus a "Toslink Female to Mini-Plug Male Adapter" (found on Amazon) let me connect these together? It looks like the pieces will fit, but I'd like to hear from someone confidently knowledgeable on the matter before I buy something. Thanks!

  • RAID 5 configuration and future expansion

    - by Alexis Hirst
    I am building a PC to act as a file server among other things, and I was wondering whether it is a good idea to create two partitions on the RAID 5 array (one for the OS, one for data) or to have a separate disk for the OS and use the array for data. Also, one day I may want to add another disk to the array, so would there be any issues resizing the data partition if the OS partition were on the RAID 5 array?

  • Excel file growing huge (>150 MB)

    - by Josh
    There is one particular Excel file that is used by a number of employees at my company. It is edited from both Excel 2003 and 2007, with the "Sharing" feature turned on to allow multiple writers at once. The file has a decent amount of data on several sheets with some basic formatting, and used to be about 6 MB, which seems reasonable for its content. But after a few weeks of editing, the file grew to 10, then 20 MB, and eventually skyrocketed to more than 150 MB, even though it still has about the same amount of data as before. It now takes 5-10 minutes to open, and that much time again to save. The first time this happened, I copied the content of each sheet into a new, blank workbook and saved the new workbook; this brought it back down to about 6 MB. Now, it has blown up again. The workbook uses the "Data Validation" feature to limit the values in certain columns to the contents of a few named ranges. Copying all the data into a new workbook means re-creating all the data validation, which is a pain and not something we want to do every month. As a troubleshooting step, I tried saving the file in "XML Spreadsheet 2003" format, hoping to get some insight into what was being stored. Sure enough, the file was almost a gig, and almost all of its 10 million lines look like this:

    <NamedCell ss:Name="Z_21D5114F_E50C_46AC_AA4F_C3FF540C717F_.wvu.FilterData"/>
    <NamedCell ss:Name="Z_1EE2BA5E_3011_4F9A_8ACD_E58835250FC4_.wvu.FilterData"/>
    <NamedCell ss:Name="Z_1E3BDCEA_6A72_4ECC_BF4F_7B03CC66181E_.wvu.FilterData"/>

    I've seen a few VBScripts online to manage and enumerate named cells that are hidden in Excel's built-in interface, though I wonder how they'd handle my 10 million named cells. What I really need, though, is an understanding of why this keeps happening. What actions in Excel could be causing this?
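
    If the workbook can be saved as .xlsx, one way to bulk-remove those hidden names without VBScript is openpyxl. A rough sketch, assuming a recent openpyxl (3.1+) where defined_names behaves like a dict; file names are hypothetical, and this should be tested on a copy, since deleting names can break formulas that reference them.

    from openpyxl import load_workbook

    wb = load_workbook("bloated.xlsx")

    # The bloat pattern from the XML dump: per-user custom-view filter names.
    stale = [name for name in list(wb.defined_names) if ".wvu." in name]
    print(f"removing {len(stale)} hidden names")
    for name in stale:
        del wb.defined_names[name]

    wb.save("cleaned.xlsx")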

  • How to record my voice on a Mac Mini with headphones?

    - by user718408
    I'm trying to record my voice via the headphones on a Mac Mini, but it's not working. I saw on Apple's site that the Mac Mini can record voice, but it doesn't seem to be working for me. Here is a hardware overview:

    Model Name: Mac Mini
    Model Identifier: Macmini3,1
    Processor Name: Intel Core 2 Duo
    Processor Speed: 2.26 GHz
    Number Of Processors: 1
    Total Number Of Cores: 2
    L2 Cache: 3 MB
    Memory: 4 GB

    Audio:
    Make: Intel High Definition Audio
    Audio ID: 65
    Headphone connection: Combination Output
    Line Input connection: Combination Input
    Speaker connection: Internal
    S/PDIF Optical Digital Audio Output connection: Combination Output
    S/PDIF Optical Digital Audio Input connection: Combination Input

    Any ideas how I can successfully get recording working?

  • How to achieve the following RTO & RPO with logshipping only using SQL Server?

    - by Jimmy Chandra
    Trying to come up with a viable backup/restore and log shipping solution for achieving the following:

    - 15 minute Recovery Point Objective (no more than 15 minutes of data loss at any time)
    - 5 minute Recovery Time Objective (must be able to get the db back up and running within 5 minutes)

    I'm considering using log shipping only (which I think is kind of pushing it, but I want to know if anyone else knows how to achieve this). Some other info for consideration:

    - 40 Gbit/sec fiber channel between the primary and disaster recovery (DRC) sites
    - The sites are about 600 km apart.
    - At close of business, the amount of data generated is predicted to be about 150 MB/sec.
    - Log backup is planned for every 5 min.

    Doing some rough calculation I came up with the following numbers: 40 Gbit/sec = 5 MB/sec @ 100% network efficiency, and 5 MB/sec = 300 MB/min. At 300 MB/min, the total amount of data that can be transferred within the 5 minute RTO is about 1.5 GB, but that leaves no time for the actual backup and restore. So if we cut it down to 3 minutes of log shipping time, that equals ~900 MB over 3 minutes at 100% network efficiency, which leaves about 1 minute of backup time and 1 minute of restore time. I currently don't have any information on whether the system being used is capable of restoring 900 MB in 1 minute, but assume it can. For the close-of-business scenario: 150 MB/sec over the 3 minute log shipping window equals about 27 GB of data over 3 minutes... I think this is where the SLA will break, since there is no way to transfer 27 GB of data in 3 minutes at these rates. Can I get someone else's opinion? I am thinking database mirroring might be a better answer for this...
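
    A quick sketch of the arithmetic above, parameterised so the assumptions can be varied (the 5 MB/sec effective rate and the 3-minute shipping window are taken from the question as-is, not verified):

    def transfer_minutes(data_mb, link_mb_per_sec, efficiency=1.0):
        """Minutes needed to move data_mb across the link."""
        return data_mb / (link_mb_per_sec * efficiency * 60)

    LINK_MB_S = 5.0      # effective rate assumed in the question
    SHIP_WINDOW = 3.0    # minutes left for shipping inside the 5-minute RTO

    # Normal load: ~900 MB fits the 3-minute window exactly.
    print(transfer_minutes(900, LINK_MB_S))        # 3.0

    # Close of business: 150 MB/sec for the 3-minute interval = 27,000 MB.
    cob_mb = 150 * 60 * SHIP_WINDOW
    print(transfer_minutes(cob_mb, LINK_MB_S))     # 90.0 -> the SLA breaks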

  • Can I have 2Gbit over 1Gbit Nics

    - by Daniel
    So this really baffles me. Apparently, because 1Gbit Ethernet can transmit data in both directions simultaneously, it should be possible to get 2Gbit of data transfer on a single NIC (1Gbit send and 1Gbit receive). People claim that because 1Gbit is (almost always) full-duplex, it is exactly 2Gbit in total. My intuition and electrical background tell me that something is not right here: 4 twisted pairs at 250Mbit capacity each gives 1Gbit, unless it is really possible to transfer data in both directions simultaneously. I did a test with iperf: Ubuntu Server 12.04 <-- MacBook Pro, both with decent CPU speed. Testing the speed of the connection individually in each direction, on the Mac I can see 112MB/s regardless of which direction the data is going, and on Ubuntu with vnstat and ifstat I got 970Mbit speeds. Now, launching iperf in server mode on both machines at the same time and sending data using 2 iperf clients shows that the Ubuntu box is, for example, sending at 600Mbit and receiving 350Mbit, which adds up to pretty much a 1Gbit link. So to me there is no magical 2Gbit. Can someone confirm that, or tell me why I'm wrong? Another thing that confuses me is the fact that a 24-port switch has, for example:

    Throughput up to: 50.6 Mpps
    Switching capacity: 68 Gbps
    Switch fabric speed: 88 Gbps

    Which would suggest they can handle 2Gbit per port.
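
    As a toy illustration of the duplex accounting, using only figures quoted in the question: vendor "switching capacity" counts each direction separately, which is where per-port 2 Gbit numbers come from, while a single iperf pair reports per-direction rates.

    PORTS = 24
    LINE_RATE_GBIT = 1.0

    one_way = PORTS * LINE_RATE_GBIT        # 24 Gbit/s in each direction
    duplex_quoted = 2 * one_way             # 48 Gbit/s, marketing-style total
    print(one_way, duplex_quoted)

    # The simultaneous iperf result from the question, summed:
    print(600 + 350, "Mbit/s aggregate observed on the Ubuntu box")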

  • Apache and file permissions

    - by Matthew
    I'm running LAMP on Ubuntu 8.04. Apache's username and group are www-data. I put my connection details and AES key in a file in a directory that's not web served. I chown-ed the files to www-data:www-data and set the permissions to 700. Still, the script that require()s these files will only run if I chmod the files to 755. What am I missing?
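
    A sketch of one way to debug this: check, component by component, what permission the Apache user actually has along the path. Every parent directory needs execute (search) permission for www-data, and if PHP runs as a different user than the file's owner, a 700 file is unreadable even though 755 works. The path below is hypothetical.

    import os, pwd, grp, stat

    def check_access(path, user="www-data"):
        """Report read/execute permission for `user` on every component of `path`."""
        pw = pwd.getpwnam(user)
        gids = {g.gr_gid for g in grp.getgrall() if user in g.gr_mem} | {pw.pw_gid}

        current = ""
        for part in path.strip("/").split("/"):
            current += "/" + part
            st = os.stat(current)
            is_dir = stat.S_ISDIR(st.st_mode)
            # Directories need execute (search); the final file needs read.
            if st.st_uid == pw.pw_uid:
                ok, who = st.st_mode & (stat.S_IXUSR if is_dir else stat.S_IRUSR), "owner"
            elif st.st_gid in gids:
                ok, who = st.st_mode & (stat.S_IXGRP if is_dir else stat.S_IRGRP), "group"
            else:
                ok, who = st.st_mode & (stat.S_IXOTH if is_dir else stat.S_IROTH), "other"
            print(f"{current}: via {who}: {'ok' if ok else 'DENIED'}")

    check_access("/etc/myapp/secrets.inc.php")  # hypothetical path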

  • How does iperf calculate throughput and jitter?

    - by Someone
    I've read that iperf basically tries to send as much information down a connection as quickly as possible, reporting on the throughput achieved. This tool is especially useful in determining the volume of data that links between two machines can supply. Is it possible to gather the same results by sending regular data, as in not testing data? What I'm trying to do is this: send data in the foreground while in the background gathering statistics (throughput and jitter). So can anyone tell me how iperf calculates these two values?
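
    For reference: iperf's throughput is simply bytes delivered divided by elapsed time over the reporting interval, and its UDP jitter is the smoothed inter-arrival estimator from RTP (RFC 1889/3550). For each packet, take the difference D between the receive-time spacing and the send-time spacing of consecutive packets, and fold |D| into a running average with gain 1/16. A sketch of that estimator (the timestamps are made-up sample data):

    def rfc3550_jitter(send_times, recv_times):
        """Running jitter estimate as used by RTP (and iperf's UDP mode)."""
        jitter = 0.0
        for i in range(1, len(send_times)):
            # D = (arrival spacing) - (departure spacing) of packets i-1 and i
            d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
            jitter += (abs(d) - jitter) / 16.0
        return jitter

    # Made-up timestamps in seconds: packets sent every 10 ms,
    # arriving with slightly variable delay.
    send = [i * 0.010 for i in range(6)]
    recv = [0.0050, 0.0152, 0.0249, 0.0355, 0.0451, 0.0548]
    print(f"jitter ~= {rfc3550_jitter(send, recv) * 1000:.3f} ms")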

  • How can I get the path to a Windows service executable WITHOUT using sc qc?

    - by Jared
    I need to query a Windows service for the path to its executable via the command prompt. I think the way I would do this is: sc qc myServiceName, but when I do that, I get the following error:

    [SC] QueryServiceConfig FAILED 122: The data area passed to a system call is too small.
    [SC] GetServiceConfig needs 1094 bytes

    I think this means that the sc command is passing a buffer to some other library that is too small for the data that needs to be returned. Instead of SC nicely retrying with a larger buffer (1094 bytes), it bombs out and gives me this ugly error message. Thanks Micro$oft. So is there a way to work around this error? I just need the path to the executable, but will parse it out of some other text if needed.
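
    Two workarounds avoid the buffer problem: sc qc accepts an explicit buffer size as a trailing argument (sc qc myServiceName 1094, using the size from the error message), and the path can also be read straight from the registry, where the Service Control Manager stores it as ImagePath. A Python sketch of the registry route (the service name is hypothetical):

    import winreg

    def service_image_path(name):
        key = winreg.OpenKey(
            winreg.HKEY_LOCAL_MACHINE,
            rf"SYSTEM\CurrentControlSet\Services\{name}",
        )
        try:
            value, _type = winreg.QueryValueEx(key, "ImagePath")
            return value  # may be REG_EXPAND_SZ containing %SystemRoot% etc.
        finally:
            winreg.CloseKey(key)

    print(service_image_path("myServiceName"))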

  • Is it possible to download minimal drivers on Windows, without the extra software packages?

    - by Anton Gogolev
    The situation in all its glory:

    - 30 megs for ATI's "Display Driver Only"
    - Almost 100 megs for NVidia's GeForce/Ion bloatware
    - 300 megs for an HP printer driver with an immense amount of crap
    - 30 megs for a Realtek integrated sound card driver
    - 50 megs for a mouse driver
    - ...and dozens and dozens of other similar examples

    Additionally, the UI/UX on vendors' sites is really terrible: I have to carefully pick and choose the exact model of my whatever, although these packages do contain drivers for pretty much every possible hardware model out there. My question is: how and where can I download truly minimal drivers without all these ATI "Installation Managers", Realtek GUIs, ASUS tools, etc.?

  • Windows 2008 R2: can't extend C drive, mystery partitions

    - by wfaulk
    I have a Windows 2008 R2 server running under VMware ESX 4.0.0. I have reallocated disk space to it in order to extend the C drive, but Disk Management has "Extend Volume" greyed out. DISKPART shows more partitions than Disk Management shows, including one after the volume I'm trying to extend, which would explain why Disk Management isn't allowing the extension.

    Disk Management shows:

    - System Reserved / 100MB NTFS / Healthy (System)
    - (C:) / 39.39 GB NTFS / Healthy (Boot, Page File, Crash Dump)
    - 10.00 GB / Unallocated

    DISKPART shows:

    - Partition 1 / Dynamic / Data / 992 KB / 31 KB
    - Partition 2 / Dynamic / Data / 100 MB / 1024 KB
    - Partition 3 / Dynamic / Data / 39 GB / 101 MB
    - Partition 4 / Dynamic / Data / 1024 KB / 39 GB

    My question at this point is: what the heck are partitions 1 and 4, where did they come from, why doesn't Disk Management show them, and, most importantly, can I delete partition 4 in order to extend partition 3?

  • Why is a FLAC encoded from a decoded MP3 bigger than the MP3?

    - by Ryan Thompson
    To be more precise than in the title: suppose I have an MP3 file that is 320 kbps. If I decompress it, then logically, all the data except for roughly 320 kilobits out of each second of audio should be redundant data, able to be compressed away. So, when I encode the decompressed file to FLAC, or any other lossless codec, why is it so much larger? On a related note, is it theoretically possible to losslessly recover the source MP3 audio from a decompressed wav? (I know the MP3 itself is lossy. I'm asking if it's possible to re-encode without any further loss.) EDIT: Let me clarify the related question, and the rationale behind it. Suppose I have a wav that was decompressed from an MP3 file (and assume I don't have the MP3 itself for some reason). If I don't want to lose any more quality, I can re-encode it with FLAC or any other lossless encoder and get a larger file just to maintain the same quality. Or, I can re-encode it to MP3 again and get the same size as the original but lose more data. Obviously, neither of these cases is ideal. I can either have the original size or the original quality, but not both (I mean the quality of the original MP3, not the original lossless source). My question is: can we get both? Is it theoretically possible to recover the lossy compressed data from the lossy decompressed data, without losing even more? If it is possible, I could imagine a lossless compression algorithm that compresses the audio with FLAC, then also scans the audio for any signs of previous lossy compression, and if detected, recompresses it losslessly to the original lossy file. Then it keeps whichever file is smaller.

  • Why does my Mac always crash when I enable `ask for password after screensaver ended`?

    - by Koning Baard XIV
    I have enabled these two things:

    - Placing the mouse pointer in the bottom-left corner of any display makes the screensaver appear
    - After the screensaver or stand-by has ended, ask for a password

    However, this combination always leads to a Black Screen of Death after entering the screensaver via the bottom-left corner. Here are my system specs:

    Hardware Overview:
    Model Name: iMac
    Model Identifier: iMac9,1
    Processor Name: Intel Core 2 Duo
    Processor Speed: 2,66 GHz
    Number Of Processors: 1
    Total Number Of Cores: 2
    L2 Cache: 6 MB
    Memory: 2 GB
    Bus Speed: 1,07 GHz
    Boot ROM Version: IM91.008D.B08
    SMC Version (system): 1.44f0
    Serial Number (system): W89171JF0TF
    Hardware UUID: 323A90F0-8A2F-5057-B501-2087489E0DFF

    System Software Overview:
    System Version: Mac OS X 10.6.3 (10D573)
    Kernel Version: Darwin 10.3.0
    Boot Volume: Macintosh HD
    Boot Mode: Normal
    Computer Name: YOU SHOULD NOT KNOW THIS
    User Name: YOU SHOULD NOT KNOW THIS
    Secure Virtual Memory: Not Enabled
    64-bit Kernel and Extensions: No
    Time since boot: 11:46

    Can anyone help me? Thanks

  • How to make TimeMachine back up contents of any path or mounted volume

    - by Olfan
    I keep different types of data in different encrypted sparsebundle images (say, one for each client) which automatically mount upon login but can't be opened by anybody other than myself. So, after login I have a number of virtual volumes in /Volumes/, which keeps my client data both secure and organized. How do I include the data inside these virtual volumes in TimeMachine's backups, or data residing in any path on any partition/volume? I found a promising solution description at blog.eurocomp.info involving editing com.apple.TimeMachine.plist, but all I can get TimeMachine to do is back up the sparsebundle files themselves. I want it to back up the files inside the mounted image, though - something like adding /Volumes/Client_abc/ to TimeMachine's search path. Please do not redirect me to this previous question, as it doesn't solve the problem at all. Please also refrain from telling me why you think I should not want this answer, as that will not solve anything either. Please lastly don't say "it can't be done" unless you can technically prove that claim.
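
    Before editing com.apple.TimeMachine.plist by hand, it helps to dump what is currently in it. A read-only sketch using Python's plistlib; the key names vary between OS X releases, so this deliberately makes no assumption about which are present:

    import plistlib

    PLIST = "/Library/Preferences/com.apple.TimeMachine.plist"

    with open(PLIST, "rb") as fh:   # may require sudo to read
        prefs = plistlib.load(fh)

    # Print every top-level key so the include/exclude lists can be located.
    for key, value in prefs.items():
        print(f"{key}: {value!r}")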

  • Adding a single 300Gb SCSI drive to poweredge 2850

    - by John Steele
    I have a 2850 set up with three 146GB drives and two partitions: a 12GB system partition with Server 2003 SP2 and a 261GB data partition. I am strapped for disk space on the data partition and keep having to push data around. I wanted to add a single 300GB drive for less critical data; is this possible? Or is it best to add two 300GB drives in another RAID 1 configuration? This is my church network, and while it is mission critical it is not enterprise, so I can take it down for a few hours. Any pointers to documentation or direct help would be greatly appreciated. John

  • ruby: invalid opcode

    - by adamo
    There's a fairly complex application that runs on two VMs (on Xen). Both VMs run CentOS 6.2 with the exact same packages and configuration for every application running (minus networking, which is different). SELinux is disabled on both. On machine A the application builds perfectly. On machine B, when running some tests, we get:

    ruby[2010] trap invalid opcode ip:7ff9d2944c30 sp:7fff9797e0f8 error:0 in ld-2.12.so[7ff9d2930000+20000]

    Digging a bit more to find out where the machines differ, machine A has:

    model name : Six-Core AMD Opteron(tm) Processor 2423 HE

    and machine B:

    model name : AMD Opteron(TM) Processor 6272

    I've tried booting machine B with cpuid_mask_cpu=fam_10_rev_c in grub, but it did not help either. So any advice as to how to deal with this, or how to approach the hosting provider so as to run this VM on another physical machine, will be greatly appreciated.
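
    Since the 2423 HE (K10) and the 6272 (Bulldozer) expose different instruction sets, a binary or shared library built on one host can trap with an invalid opcode on the other. A quick sketch for diffing the advertised CPU flags between the two machines (copy /proc/cpuinfo from each host into the hypothetical file names below):

    def cpu_flags(path):
        """Flags advertised by the first CPU listed in a /proc/cpuinfo dump."""
        with open(path) as fh:
            for line in fh:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
        return set()

    a = cpu_flags("cpuinfo_machineA.txt")
    b = cpu_flags("cpuinfo_machineB.txt")
    print("only on A:", sorted(a - b))
    print("only on B:", sorted(b - a))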

  • Alternatives to Remote Storage Service under Windows Server 2008 R2

    - by ObligatoryMoniker
    I am working on setting up a new Windows Server 2008 R2 file server for our organization, and felt that the functionality offered by the Remote Storage Service in previous versions of Windows would meet our need to segment our data so that we can have different backup schedules for different tiers of data, based on how frequently that data is used and updated. What software exists that provides the same or similar functionality for Server 2008 R2?
