Search Results

Search found 3463 results on 139 pages for 'physical'.

Page 8/139

  • SQL SERVER – Simple Demo of New Cardinality Estimation Features of SQL Server 2014

    - by Pinal Dave
    SQL Server 2014 has a new cardinality estimation logic/algorithm. The cardinality estimation logic is responsible for the quality of query plans, and is largely responsible for the performance of any query. This logic had not been updated for quite a while, but in the latest version, SQL Server 2014, it has been redesigned. The new logic incorporates various assumptions and algorithms for OLTP and warehousing workloads. "Cardinality estimates are a prediction of the number of rows in the query result. The query optimizer uses these estimates to choose a plan for executing the query. The quality of the query plan has a direct impact on improving query performance." ~ Source: MSDN

    Let us see a quick example of how the new cardinality estimation improves performance of a query. I will be using the AdventureWorks database for my example. Before we start with this demonstration, remember that even though you have SQL Server 2014, to see the effect of the new cardinality estimates you will need your database compatibility level set to 120, which is the level for SQL Server 2014. If your server instance is SQL Server 2014 but your database compatibility level is set to 110 or any other earlier version, your query will perform as it would on the older version of SQL Server. Now we will execute the following query under two different compatibility levels and compare its performance. (Note that my SQL Server instance is of version 2014.)

        USE AdventureWorks2014
        GO
        -- -------------------------------
        -- NEW Cardinality Estimation
        ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 120
        GO
        EXEC [dbo].[uspGetManagerEmployees] 44
        GO
        -- -------------------------------
        -- Old Cardinality Estimation
        ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 110
        GO
        EXEC [dbo].[uspGetManagerEmployees] 44
        GO

    Result of STATISTICS IO

    Compatibility level 120:

        Table 'Person'. Scan count 0, logical reads 6, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Employee'. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Worktable'. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    Compatibility level 110:

        Table 'Worktable'. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Person'. Scan count 0, logical reads 137, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Employee'. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    You will notice that at compatibility level 110 there are 137 logical reads from the Person table, whereas at compatibility level 120 there are only 6 logical reads from the Person table. This drastically improves the performance of the query. If we enable the execution plan, we can see the same there as well. I hope you will find this quick example helpful. You can read more about this in my latest Pluralsight course.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Are there any disadvantages of having a "free fall sensor" on a hard disk drive?

    - by therobyouknow
    This is a general question that came out of a specific comparison between the Western Digital Scorpio WD3200BEKT and the Western Digital Scorpio WD3200BJKT (which is the same as the former but with a free fall sensor). Note: I'm not asking for a review or appraisal of these specific drives, as the general question applies to other brands as well, though your input would help my decision. To break the general question down in order to answer it, I would be looking for comments on things like:

        Is it necessary to have differing physical dimensions between free fall sensor drives and those without? E.g. does the sensor make the drive any thicker, and therefore reduce the range of systems where it can be installed - particularly smaller laptops?

        Does it actually make the system less reliable - because of false alarms, whereby the drive thought the laptop was falling but it wasn't?

    I suppose the fact that a manufacturer produces drives both with and without free fall sensors says something about possible disadvantages. Or it could be a standard marketing technique, whereby making drives both with and without the sensor results in a larger sales volume than offering only those with the feature.

    Read the article

  • Create PDF in memory instead of physical file

    - by acadia
    How does one create a PDF in a MemoryStream instead of a physical file using iTextSharp? The code below creates an actual PDF file. Instead, how can I get the output into a byte[] so that I can return it from a function?

        using iTextSharp.text;
        using iTextSharp.text.pdf;

        Document doc = new Document(iTextSharp.text.PageSize.LETTER, 10, 10, 42, 35);
        PdfWriter wri = PdfWriter.GetInstance(doc, new FileStream("c:\\Test11.pdf", FileMode.Create));
        doc.Open(); // Open document to write
        Paragraph paragraph = new Paragraph("This is my first line using Paragraph.");
        Phrase pharse = new Phrase("This is my second line using Pharse.");
        Chunk chunk = new Chunk(" This is my third line using Chunk.");
        doc.Add(paragraph);
        doc.Add(pharse);
        doc.Add(chunk);
        doc.Close(); // Close document
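    A note on the usual fix, sketched under the assumption that the same iTextSharp version as above is in use: PdfWriter.GetInstance accepts any Stream, so the FileStream can be swapped for a MemoryStream, and MemoryStream.ToArray() still works after the document (and with it the stream) is closed.

        // Sketch, not the asker's code: build the same document in memory and return the bytes.
        using System.IO;
        using iTextSharp.text;
        using iTextSharp.text.pdf;

        public static byte[] CreatePdfBytes()
        {
            using (MemoryStream ms = new MemoryStream())
            {
                Document doc = new Document(PageSize.LETTER, 10, 10, 42, 35);
                PdfWriter.GetInstance(doc, ms);  // MemoryStream instead of FileStream
                doc.Open();
                doc.Add(new Paragraph("This is my first line using Paragraph."));
                doc.Close();         // flushes the PDF and closes the stream
                return ms.ToArray(); // ToArray() is valid even on a closed MemoryStream
            }
        }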

    Read the article

  • Unit Testing XML independent of physical XML file

    - by RAbraham
    Hi, my question is: in JUnit, how do I set up XML data for my system under test (SUT) without making the SUT read from an XML file physically stored on the file system?

    Background: I am given an XML file which contains rules for the creation of an invoice. My job is to convert these rules from XML to Java objects. E.g., if there is a tag as below in my XML file, which indicates that after a period of 30 days the transaction cannot be invoiced:

        <ExpirationDay>30</ExpirationDay>

    this converts to a Java class, say ExpirationDateInvoicingRule. I have a class InvoiceConfiguration which should take the XML file and create the *InvoicingRule objects. I am thinking of using StAX to parse the XML document within InvoiceConfiguration.

    Problem: I want to unit test InvoiceConfiguration, but I don't want InvoiceConfiguration to read from an XML file physically on the file system. I want my unit test to be independent of any physically stored XML file. I want to create an XML representation in memory. But a StAX parser only takes a FileReader (or I can play with the File object).

    Read the article

  • How to find the physical path of a GSP file in a deployed grails application

    - by Deepak Mittal
    I need to find out the physical path of a Grails GSP file. My requirement is that I want to create a new layout file at run-time and use that in the application. I have been able to achieve this without problems when the application runs on Jetty (grails run-app); however, when I deploy the app on JBoss, the path at which the file needs to be created changes. So, ideally, I would like to find out at runtime, using some magical utility, the path of a particular GSP (let's say the main.gsp layout file), and I need to create my new layout in the same directory in which main.gsp resides. Any pointers? -Deepak

    Read the article

  • Mapping of memory addresses to physical modules in Windows XP

    - by Josef Grahn
    I plan to run 32-bit Windows XP on a workstation with dual processors, based on Intel's Nehalem microarchitecture, and triple-channel RAM. Even though XP is limited to 4 GB of RAM, my understanding is that it will function with more than 4 GB installed, but will only expose 4 GB (or slightly less). My question is: assuming that 6 GB of RAM is installed in six 1 GB modules, which physical 4 GB will Windows actually map into its address space? In particular:

        Will it use all six 1 GB modules, taking advantage of all memory channels? (My guess is yes, and that the mapping to individual modules within a group happens in hardware.)

        Will it map 2 GB of address space to each of the two NUMA nodes (as each processor has its own memory interface), or will one processor get fast access to 3 GB of RAM, while the other only has 1 GB?

    Thanks!

    Read the article

  • T-SQL Table Variable Creating PHYSICAL Table!

    - by Mike
    OMG! What am I doing wrong?

        declare @WTF TABLE ( OrderItemId int )

        SELECT TOP 20 OrderItemId as OrderItemId
        INTO [@WTF]
        FROM ac_OrderItems

        SELECT * FROM [@WTF]

    Problem A: This creates a PHYSICAL table called @WTF. WHY?? I thought this was in memory only?!

    Problem B: The last line of code - if I do SELECT * FROM @WTF without the [ ], it returns NOTHING. What is the significance of the [ ]?

    I need serious help. I'm losing my MIND! Thanks in advance.

    Read the article

  • Map the physical file path in asp.net mvc

    - by rmassart
    Hi, I am trying to read an XSLT file from disk in my ASP.NET MVC controller. What I am doing is the following:

        string filepath = HttpContext.Request.PhysicalApplicationPath;
        filepath += "/Content/Xsl/pubmed.xslt";
        string xsl = System.IO.File.ReadAllText(filepath);

    However, halfway down this thread on forums.asp.net there is the following quote: "HttpContext.Current is evil and if you use it anywhere in your mvc app you are doing something wrong because you do not need it." Whilst I am not using "Current", I am wondering what the best way is to determine the absolute physical path of a file in MVC. For some reason (I don't know why!) HttpContext doesn't feel right to me. Is there a better (or recommended/best-practice) way of reading files from disk in ASP.NET MVC? Thanks for your help, Robin
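    For what it's worth, a minimal sketch of the alternative most answers converge on, assuming the code runs inside a controller action: let the framework resolve the virtual path instead of concatenating onto PhysicalApplicationPath.

        // Sketch: resolve a physical path from a virtual one inside a controller action.
        string filepath = Server.MapPath("~/Content/Xsl/pubmed.xslt");
        string xsl = System.IO.File.ReadAllText(filepath);

        // Outside a request context (e.g. a background job), the hosting API does the same job:
        // string filepath = System.Web.Hosting.HostingEnvironment.MapPath("~/Content/Xsl/pubmed.xslt");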

    Read the article

  • Migrating from Physical SQL (SQL2000) To VMWare machine (SQL2008) - Transferring Large DB

    - by alex
    We're in the middle of migrating from a Windows & SQL 2000 box to a virtualised Windows & SQL 2k8 box. The VMware box is on a different site, with better hardware, connectivity, etc. The old (current) physical machine is still in constant use. I've taken a backup of the DB on this machine, which is 21 GB. Transferring this to our virtual machine took around 7+ hours - which isn't ideal for when we do the "actual" switchover. My question is: how should I handle the migration better? Could I set up our current machine to do log shipping to the VM to keep it up to date, and then schedule downtime out of hours to do the switchover? Is there a better way?

    Read the article

  • How to specify physical path in ASPX page?

    - by salvationishere
    I am developing a C# VS 2008 / SQL Server 2008 ASP.NET Web Application project. In one of my ASPX files I am trying to reference the master file, which is actually located in the parent website. In other words, when I open the parent website, I see this project listed. But when I open this project separately, I do not see the parent website, and this project is the root. So how do I use the master file from the parent website? Currently, I have this in my ASPX file:

        <%@ Page Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeFile="EnhancedCreateUserWizard.aspx.cs" Inherits="Membership_EnhancedCreateUserWizard" Title="Untitled Page" %>

    But this won't work, because it is a virtual path, and since this project is the root, I can't reach the parent's master file virtually. Instead I want to specify a physical path. How do I accomplish this?
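    One caveat worth adding: MasterPageFile accepts only virtual paths, so a physical path will not work. A sketch of the usual programmatic workaround, assuming the parent site's master page can be reached through some virtual path (the "~/ParentSite/" prefix here is hypothetical, e.g. an IIS virtual directory):

        // Sketch: the master page can be assigned in code, but only during PreInit,
        // and the value must still be a virtual path (never a physical one).
        protected void Page_PreInit(object sender, EventArgs e)
        {
            this.MasterPageFile = "~/ParentSite/Site.Master"; // hypothetical virtual directory
        }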

    Read the article

  • windows fails to allocate the amount of free physical memory returned by GlobalMemoryStatusEx

    - by avi
    Hello! What I'm trying to do is get the amount of free physical memory, allocate it, and then manage it (resizing or deleting it) depending on what further calls to GlobalMemoryStatusEx return. The problem: it works on two PCs with Win 7 x64 - one with 2 GB of RAM (on which I was able to allocate about 1.3 GB), the other with 1 GB of RAM (max alloc was 630 MB). It fails on a third one with 3 GB of RAM. I can't find the problem. I tried Google! Any solution?
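    For reference, a C# P/Invoke sketch of the GlobalMemoryStatusEx call being discussed (the question's language isn't stated, so C# is an assumption here). Note that in a 32-bit process the largest single allocation is bounded by free contiguous virtual address space, not by the free physical memory the call reports - which is one plausible reason the 3 GB machine fails.

        using System;
        using System.Runtime.InteropServices;

        [StructLayout(LayoutKind.Sequential)]
        struct MEMORYSTATUSEX
        {
            public uint dwLength;
            public uint dwMemoryLoad;
            public ulong ullTotalPhys;
            public ulong ullAvailPhys;
            public ulong ullTotalPageFile;
            public ulong ullAvailPageFile;
            public ulong ullTotalVirtual;
            public ulong ullAvailVirtual;
            public ulong ullAvailExtendedVirtual;
        }

        static class MemInfo
        {
            [DllImport("kernel32.dll", SetLastError = true)]
            static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX lpBuffer);

            static void Main()
            {
                var s = new MEMORYSTATUSEX();
                s.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX)); // must be set before the call
                if (GlobalMemoryStatusEx(ref s))
                {
                    Console.WriteLine("Free physical: {0} MB", s.ullAvailPhys / (1024 * 1024));
                    Console.WriteLine("Free virtual:  {0} MB", s.ullAvailVirtual / (1024 * 1024));
                }
            }
        }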

    Read the article

  • Best strategy for moving data between physical tiers in ASP.net

    - by Pete Lunenfeld
    Building a new ASP.NET application, and planning to separate the DB, 'service' tier and Web/UI tier into separate physical layers. What is the best/easiest strategy for moving serialized objects between the service tier and the UI tier? I was considering serializing POCOs into JSON, using simple ASP.NET pages to serve the middle tier - meaning that the UI/Web tier will request data from a (hidden to the outside user) web server that will return a JSON string. This kind of JSON 'emitter' seems easily testable. It also seems easily compressible for efficiently moving data over the WAN between tiers. I know that some folks use .asmx web services for this kind of task, but this seems to carry excess overhead with SOAP, and the payload is not as human-readable (testable) as POCOs serialized as JSON. Others use more complex technology like WCF, which we have never used. Does anyone have advice for choosing a method for moving data/objects between the data (DB) tier and the web (UI) tier over the WAN using .NET technologies? Thanks!!!
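    A minimal sketch of the JSON 'emitter' idea described above, with hypothetical type and file names: a generic handler (.ashx) avoids page life-cycle overhead, and JavaScriptSerializer ships with .NET 3.5+, so no extra dependencies are needed.

        // JsonEmitter.ashx.cs - sketch of a middle-tier JSON endpoint (names are hypothetical).
        using System.Web;
        using System.Web.Script.Serialization;

        public class JsonEmitter : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                // In practice this POCO would come from the service tier.
                var order = new { Id = 42, Customer = "Acme", Total = 99.95m };

                context.Response.ContentType = "application/json";
                context.Response.Write(new JavaScriptSerializer().Serialize(order));
            }

            public bool IsReusable { get { return true; } }
        }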

    Read the article

  • Animation using AniMate with Unity3D doesn't interact with physical objects

    - by Albz
    I'm designing a maze with Unity3D. The maze has a number of bifurcations, and the player will stop before each bifurcation and simply choose left or right. Then an automatic animation will move the player through the next bifurcation till the end of the maze (or till a dead end). To animate the player I'm using AniMate and C# in my Unity project. Using AniMate I'm simply creating a point-to-point animation for each bifurcation (e.g. image below: from the start/red arrow to point 5).

    My problem is that my animation script (attached to the "First Person Controller") is not working properly, since physics is not respected (the player passes through walls). If, in the same project, I enable the standard character controls in Unity, then I can navigate the maze with the physical constraints of walls etc... (i.e. I have colliders). This is an example of the code I'm using when I press left to pass from the starting point, through point 1, to point 2:

        void FixedUpdate () {
            if (Input.GetKey(KeyCode.LeftArrow)) {
                // To point 1
                Hashtable props = new Hashtable();
                props.Add("position", new Vector3(756f, 112f, 1124f));
                props.Add("physics", true);
                Ani.Mate.To(transform, 2, props);

                // To point 2
                Hashtable props2 = new Hashtable();
                props2.Add("position", new Vector3(731f, 112f, 1124f));
                props2.Add("physics", true);
                Ani.Mate.To(transform, 2, props2);
            }
        }

    What happens in practice when I press the left arrow button is that the player moves directly to point 2, in a straight line passing through the wall. I tried to pass "physics = true" to AniMate, but it doesn't seem to help. Any idea how to solve this issue? Alternatively... any hint on how to have more optimized code and just use a series of Vector3 coordinates (one for each point) to obtain the simple animation I want, without having to declare new Hashtable(); etc... every time? I chose AniMate simply because 1. I'm a beginner with Unity, and 2. I don't need complex animations (e.g. I don't need to use iTween) - just fixed animations along straight lines - and I need something really simple and quick to implement in a script. However, if someone has an equally simple solution it will be welcome. Thank you in advance for your help.
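    On the secondary question, a small sketch of a wrapper that removes the per-waypoint Hashtable boilerplate. It reuses only the Ani.Mate.To call shape shown in the question, so whether consecutive calls run in sequence or in parallel still depends on how AniMate queues its tweens - that part is an assumption left to verify.

        // Sketch: one call per waypoint instead of repeated Hashtable declarations.
        using System.Collections;
        using UnityEngine;

        public class MazeMover : MonoBehaviour
        {
            // Assumes Ani.Mate.To accepts the same Hashtable shape used in the question.
            void MoveTo(Vector3 target, float seconds)
            {
                Hashtable props = new Hashtable();
                props.Add("position", target);
                props.Add("physics", true);
                Ani.Mate.To(transform, seconds, props);
            }

            void FixedUpdate()
            {
                if (Input.GetKey(KeyCode.LeftArrow))
                {
                    MoveTo(new Vector3(756f, 112f, 1124f), 2f); // to point 1
                    MoveTo(new Vector3(731f, 112f, 1124f), 2f); // to point 2
                }
            }
        }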

    Read the article

  • Delphi Pascal - Using SetFilePointerEx and GetFileSizeEx, Getting Physical Media exact size when reading as a file

    - by SuicideClutchX2
    I am having trouble understanding how to declare GetFileSizeEx and SetFilePointerEx in Delphi 2009 so that I can use them, since they are not in the RTL or Windows.pas. I was able to compile with the following:

        function GetFileSizeEx(hFile: THandle; lpFileSizeHigh: Pointer): DWORD; external 'kernel32';

    Then I used GetFileSizeEx(PD, Pointer(DriveSize)); to get the size, but could not get it to work. The disk handle I am using is valid, and I have had no problem reading the data or working under the 2 GB mark with the older APIs. GetFileSize, of course, returns 4294967295. I have had greater trouble trying to use SetFilePointerEx with the data types it uses. The overall project needs to read the data from a flash card, which is not a problem at all - I can do this. My problem is that I cannot find the length or size of the media I will be reading. I have code I have used in the past to do this with media under 2 GB, but now that I need to read media over 2 GB it is a problem. If you still don't understand: I am dumping a card with all data, including the boot record, etc. This is the code I would normally use to read from the physical disk - to grab, say, the boot record and dump it to a file:

        SetFilePointer(PD, 0, nil, FILE_BEGIN);
        SetLength(Buffer, 512);
        ReadFile(PD, Buffer[0], 512, BytesReturned, nil);

    I just need to figure out how to find the end of an 8 GB card and so on, as well as how to set a file pointer beyond the 2 GB barrier. Any help with the external declarations, as well as understanding the values that SetFilePointerEx uses (I do not understand the whole high/low thing), would be of great help.

        var
          Form1: TForm1;

        function GetFileSizeEx(hFile: THandle; var FileSize: Int64): DWORD; stdcall; external 'kernel32';

        implementation

        {$R *.dfm}

        function GetLD(Drive: Char): Cardinal;
        var
          Buffer: String;
        begin
          Buffer := Format('\\.\%s:', [Drive]);
          Result := CreateFile(PChar(Buffer), GENERIC_READ or GENERIC_WRITE, FILE_SHARE_READ, nil, OPEN_EXISTING, 0, 0);
          if Result = INVALID_HANDLE_VALUE then
          begin
            Result := CreateFile(PChar(Buffer), GENERIC_READ, FILE_SHARE_READ, nil, OPEN_EXISTING, 0, 0);
          end;
        end;

        function GetPD(Drive: Byte): Cardinal;
        var
          Buffer: String;
        begin
          if Drive = 0 then
          begin
            Result := INVALID_HANDLE_VALUE;
            Exit;
          end;
          Buffer := Format('\\.\PHYSICALDRIVE%d', [Drive]);
          Result := CreateFile(PChar(Buffer), GENERIC_READ or GENERIC_WRITE, FILE_SHARE_READ, nil, OPEN_EXISTING, 0, 0);
          if Result = INVALID_HANDLE_VALUE then
          begin
            Result := CreateFile(PChar(Buffer), GENERIC_READ, FILE_SHARE_READ, nil, OPEN_EXISTING, 0, 0);
          end;
        end;

        function GetPhysicalDiskNumber(Drive: Char): Byte;
        var
          LD: DWORD;
          DiskExtents: PVolumeDiskExtents;
          DiskExtent: TDiskExtent;
          BytesReturned: Cardinal;
        begin
          Result := 0;
          LD := GetLD(Drive);
          if LD = INVALID_HANDLE_VALUE then Exit;
          try
            DiskExtents := AllocMem(Max_Path);
            DeviceIOControl(LD, IOCTL_VOLUME_GET_VOLUME_DISK_EXTENTS, nil, 0, DiskExtents, Max_Path, BytesReturned, nil);
            if DiskExtents^.NumberOfDiskExtents > 0 then
            begin
              DiskExtent := DiskExtents^.Extents[0];
              Result := DiskExtent.DiskNumber;
            end;
          finally
            CloseHandle(LD);
          end;
        end;

        procedure TForm1.Button1Click(Sender: TObject);
        var
          PD: DWORD;
          BytesReturned: Cardinal;
          Buffer: array of Byte;
          myFile: File;
          DriveSize: Int64;
        begin
          PD := GetPD(GetPhysicalDiskNumber(Edit1.Text[1]));
          if PD = INVALID_HANDLE_VALUE then Exit;
          try
            GetFileSizeEx(PD, DriveSize);
            //SetFilePointer(PD, 0, nil, FILE_BEGIN);
            //SetLength(Buffer, 512);
            //ZeroMemory(@Buffer, SizeOf(Buffer));
            //ReadFile(PD, Buffer[0], 512, BytesReturned, nil);
            //AssignFile(myFile, 'StickDump.bin');
            //ReWrite(myFile, 512);
            //BlockWrite(myFile, Buffer[0], 1);
            //CloseFile(myFile);
          finally
            CloseHandle(PD);
          end;
        end;

    Read the article

  • lshw tells me my processor is a 64 bits but my motherboard has a 32 bits width

    - by bpetit
    Recently I noticed lshw tells me a strange thing. Here is the first part of my lshw output:

        bpetit-1025c
            description: Notebook
            product: 1025C (1025C)
            vendor: ASUSTeK COMPUTER INC.
            version: x.x
            serial: C3OAAS000774
            width: 32 bits
            capabilities: smbios-2.7 dmi-2.7 smp-1.4 smp
            configuration: boot=normal chassis=notebook cpus=2 family=Eee PC...
          *-core
               description: Motherboard
               product: 1025C
               vendor: ASUSTeK COMPUTER INC.
               physical id: 0
               version: x.xx
               serial: EeePC-0123456789
               slot: To be filled by O.E.M.
             *-firmware
                  description: BIOS
                  vendor: American Megatrends Inc.
                  physical id: 0
                  version: 1025C.0701
                  date: 01/06/2012
                  size: 64KiB
                  capacity: 1984KiB
                  capabilities: pci upgrade shadowing cdboot bootselect socketedrom edd...
             *-cpu:0
                  description: CPU
                  product: Intel(R) Atom(TM) CPU N2800 @ 1.86GHz
                  vendor: Intel Corp.
                  physical id: 4
                  bus info: cpu@0
                  version: 6.6.1
                  serial: 0003-0661-0000-0000-0000-0000
                  slot: CPU 1
                  size: 798MHz
                  capacity: 1865MHz
                  width: 64 bits
                  clock: 533MHz
                  capabilities: x86-64 boot fpu fpu_exception wp vme de pse tsc ...
                  configuration: cores=2 enabledcores=1 id=2 threads=2
                *-cache:0
                     description: L1 cache
                     physical id: 5
                     slot: L1-Cache
                     size: 24KiB
                     capacity: 24KiB
                     capabilities: internal write-back unified
                *-cache:1
                     description: L2 cache
                     physical id: 6
                     slot: L2-Cache
                     size: 512KiB
                     capacity: 512KiB
                     capabilities: internal varies unified
                *-logicalcpu:0
                     description: Logical CPU
                     physical id: 2.1
                     width: 64 bits
                     capabilities: logical
                *-logicalcpu:1
                     description: Logical CPU
                     physical id: 2.2
                     width: 64 bits
                     capabilities: logical
                *-logicalcpu:2
                     description: Logical CPU
                     physical id: 2.3
                     width: 64 bits
                     capabilities: logical
                *-logicalcpu:3
                     description: Logical CPU
                     physical id: 2.4
                     width: 64 bits
                     capabilities: logical
             *-memory
                  description: System Memory
                  physical id: 13
                  slot: System board or motherboard
                  size: 2GiB
                *-bank:0
                     description: SODIMM [empty]
                     product: [Empty]
                     vendor: [Empty]
                     physical id: 0
                     serial: [Empty]
                     slot: DIMM0
                *-bank:1
                     description: SODIMM DDR3 Synchronous 1066 MHz (0.9 ns)
                     product: SSZ3128M8-EAEEF
                     vendor: Xicor
                     physical id: 1
                     serial: 00000004
                     slot: DIMM1
                     size: 2GiB
                     width: 64 bits
                     clock: 1066MHz (0.9ns)
             *-cpu:1
                  physical id: 1
                  bus info: cpu@1
                  version: 6.6.1
                  serial: 0003-0661-0000-0000-0000-0000
                  size: 798MHz
                  capacity: 798MHz
                  capabilities: ht cpufreq
                  configuration: id=2
                *-logicalcpu:0
                     description: Logical CPU
                     physical id: 2.1
                     capabilities: logical
                *-logicalcpu:1
                     description: Logical CPU
                     physical id: 2.2
                     capabilities: logical
                *-logicalcpu:2
                     description: Logical CPU
                     physical id: 2.3
                     capabilities: logical
                *-logicalcpu:3
                     description: Logical CPU
                     physical id: 2.4
                     capabilities: logical

    So here I see my processor is effectively a 64-bit one. However, I'm wondering how my motherboard can have a "32 bits" width. I've browsed the web to find an answer, without success. I imagine it's just a technical fact that I don't know about. Thanks.

    Read the article

  • Blu-ray BD-R: Would you physically store it in a CaseLogic Wallet pocket?

    - by Rob
    I keep several backup copies of my material and files. For my DVDs, one set of copies is kept in a CaseLogic wallet folder pack, so that I can easily move it around when visiting friends or family, or for business. This is highly convenient. The other sets are kept in their jewel cases in hard plastic see-through storage boxes. Although CaseLogic wallet material is designed to be abrasion-free, their caveat is that external dust will be the cause of any blemishes. If hard dust gets into these pockets, which is inevitable, it will occasionally cause light, hair-like scratches on the disc surface as the discs are removed and returned. For DVDs this is of no consequence, as the laser and error correction can more than cope with it. I'm aware that the Blu-ray spec requires an anti-scratch coating on disc surfaces, but I was wondering: given the smaller pits, would dust and light scratches from wallet storage cause more problems with Blu-rays than they would with DVDs? I'm using Blu-ray BD-R and BD-R DL write-once media.

    Read the article

  • Steps to take when technical staff leave

    - by Tom O'Connor
    How do you handle the departure process when privileged or technical staff resign / get fired? Do you have a checklist of things to do to ensure the continuing operation / security of the company's infrastructure? I'm trying to come up with a nice canonical list of things that my colleagues should do when I leave (I resigned a week ago, so I've got a month to tidy up and GTFO). So far I've got:

        Escort them off the premises
        Delete their email inbox (set all mail to forward to a catch-all)
        Delete their SSH keys on server(s)
        Delete their mysql user account(s)
        ...

    So, what's next? What have I forgotten to mention, or might be similarly useful? (Endnote: Why is this off-topic? I'm a systems administrator, and this concerns continuing business security - this is definitely on-topic.)

    Read the article

  • Open source system for swipe card access?

    - by Moduspwnens
    We're looking at replacing our campus-wide magnetic swipe card system with something more robust. The "programmer" side of me says there's got to be an open-source, scalable solution that already does this, but all I've been able to find are proprietary, vendor-specific solutions. Ideally, it'd have the following:

        Based on some open standard that allows us to select from a wide range of card readers (like IMAP or HTTP)
        Support for different kinds of card access (magnetic stripe, RFID, etc.)
        Future-proof (to the extent possible)

    The lack of information I'm finding leads me to believe I'm not searching for the right things... or such a solution doesn't exist. Is there not some basic, open-source solution to this (like MySQL for databases, or Moodle for an LMS, or Apache for a web server)?

    Read the article

  • Microsoft Ergonomic Keyboards With Card Readers?

    - by Steve
    When I started working at my current job I developed tendinitis in my wrists. Luckily that cleared up when I started using a Microsoft ergonomic keyboard. The problem is that where I work is moving to more security. We will need to stick a card into a slot to log into our PCs. They bought a bunch of new keyboards with these slots built in. All regular keyboards. Is there something like the Microsoft Ergonomic keyboard that comes with such a card slot? Thanks.

    Read the article

  • Do you work in your server room?

    - by Gary Richardson
    I once had a job offer from a company that wanted my workstation to be in the AC controlled, noisy server room with no natural light. I'm not sure what their motivation was. Possibly it made sense to them for me to be close to the servers, or possibly they wanted to save the desk space for other employees. I turned down the job (for many reasons, including the working environment). Is this a common practice? Do you work in your LAN room? How do you cope?

    Read the article

  • Does more heat generation mean more wear and tear?

    - by Suhail Gupta
    I read that the hardware generally used in PCs is not optimized for running Linux, and that this is the reason the machine emits a large amount of heat and doesn't give the battery backup that we get while working on Windows. ~REF Does it also mean more wear and tear on the hardware (when using Linux, as compared to when using Windows)? Note: I have personally experienced large heat emission while working with Fedora 16 and Ubuntu 11.xx on my laptop.

    Read the article

  • How much does it wear an SD card to be frequently removed/reinserted?

    - by jtbandes
    My digital camera (a Sony a55) stores photos on an SD card. When I want to transfer these to my computer (a mid-2010 MacBook Pro), I have two options: use the USB cable to connect the camera to the computer, or use the computer's built-in SD card reader. The camera's SD card slot is the standard click-in, click-out (spring-loaded) mechanism. My laptop has a simple slot into which the card slides with a little more resistance than the former (the card slides only about halfway in so it can be easily removed). I notice that the card's contacts now have some shiny marks from one or both of these card slots: Does this type of wear threaten to significantly damage the card? Should I avoid switching the card between slots frequently, to extend its lifetime?

    Read the article
