Search Results

Search found 16809 results on 673 pages for 'nathan long'.


  • Wifi interface changes name seemingly at random

    - by ray_voelker
    I'm currently having some issues getting a wireless interface to work reliably under an install of Ubuntu 12.04.1 LTS. Some of the issues I'm experiencing:
    - The connection will drop out some time after it has initially worked.
    - The interface will have a different name after a reboot. For example, wlan0 will become wlan4 according to the ifconfig -a command.
    - Ubuntu will take a long time to boot, looking for network adapters.
    The purpose of this build is to function as a web kiosk in a library: the computer is supposed to boot up into a web browser and allow for browsing of the catalog. For some reason this interface does not appear to be working as it should. Are there any explanations for some of these problems I'm having, and perhaps some solutions? The wireless card appears as this after doing an lspci:
        Ralink corp. RT2561/RT61 802.11g PCI
    In the /etc/network/interfaces file I have the following configuration for the interface:
        auto wlan0
        iface wlan0 inet dhcp
        wireless-essid UDwireless
        wireless-mode Managed
    Thanks in advance for help on this.
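
    One common cause of the wlan0-to-wlan4 renaming on Ubuntu 12.04 is udev's persistent-net rules file accumulating a new entry each time it believes it has seen a new card. A minimal sketch of pinning the name to the card's MAC address (the address below is a placeholder; take the real one from ifconfig -a):

        # /etc/udev/rules.d/70-persistent-net.rules
        # Pin the Ralink RT61 card to the name wlan0 via its MAC address.
        SUBSYSTEM=="net", ACTION=="add", ATTR{address}=="00:11:22:33:44:55", NAME="wlan0"

    Removing any stale wlan1-wlan4 entries from the same file and rebooting should leave the interface with the name that /etc/network/interfaces expects.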

    Read the article

  • Hyper-V Virtual Disk Creation Taking Forever

    - by mnemosyn
    After some struggle, I finally managed to set up Hyper-V 2008 R2 on our server. I connected to it using the Hyper-V Manager from a Windows 7 client and used the "New Virtual Machine Wizard" to set up a 350GB virtual hard disk. I hit the "Finish" button and the Hyper-V Manager has now been working for 24 hours, showing merely a "Creating Disk" dialog. A console on the Hyper-V host still reports 99.9% free space on the HD, but the machine's HD LED flashes from time to time (making a rather idle impression; it's not flashing frenetically). Does this usually take this long? Is there a way to find out whether it's still working or just idling? Should I repeat the process? Guides on the net tell me to be patient, but one day seems a bit extreme!?
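
    One thing worth ruling out: a fixed-size VHD is zero-filled at creation time, so a 350GB disk can genuinely take many hours on slow storage, while a dynamically expanding disk is created almost instantly. As a rough cross-check you can create a disk by hand with diskpart on the Server 2008 R2 host (the path below is a placeholder):

        diskpart
        DISKPART> create vdisk file="D:\VHDs\test.vhd" maximum=358400 type=expandable
        DISKPART> exit

    If the expandable variant appears within seconds, the wizard's long run is almost certainly a fixed-disk zeroing pass; watching whether the .vhd file is still growing in Explorer is the simplest progress check.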

    Read the article

  • Proper XAML for Windows 8 Applications [closed]

    - by Jaapjan
    Traditionally, my programs do their work in the background, and when I do have to make an interface for some reason, it rarely needs to be complex, which means I can use a simple Windows Forms or console application. But let's be honest: Windows Forms? That is so... ancient! Instead I have been looking at Windows 8. A new interface, different, maybe better, but fun to give a try. Which means XAML. Now, XAML isn't all that hard in concept. Panel here, button there, a smattering of XML. My question in short: where can I find resources that teach me how to write good XAML for Windows 8 applications? The long version: how do I combine XAML constructs to achieve effects? Horizontal panels with multiple sections you can scroll through with your finger, the proper way? How should you use the default style resources Windows 8 gives you? How do I properly create a panel with user info on the right? Left-aligned StackPanels with embedded DockPanels? Yes? No? Why?

    Read the article

  • TechEd 2012: MVVM In XAML

    - by Tim Murphy
    Paul Sheriff was a real character at the start of his MVVM in XAML session.  There was a lot of sarcasm and self-deprecation going on prior to the start.  That is never a bad way to get things rolling right after lunch.  Then things got semi-serious. The presentation itself had a number of surprises, but not all of them had to do with XAML.  When he flipped open his company's code generation tool, it caught me off guard.  I am used to generators that create code for a whole project, but his tools were able to create different types of constructs on demand.  It also made it easier to follow what he was doing than some of the other demos I have seen this week where people were using code snippets. Getting to the heart of the topic, I found myself thinking that I may have found my utopia for application development in MVVM.  Yes, I know there is no such thing, but this comes closer than any other pattern I have learned about.  This pattern allows the application to have better separation of concerns than I have seen before.  This is especially true since you can leverage data binding.  I'm not sure why it has taken me so long to find time for this subject. As Paul demonstrated, using this pattern with XAML gives you multi-platform reusable code when you leverage common utility classes and ViewModel classes.  The one drawback I see is that you have to go to the lowest common denominator between the platforms you want to support, but you always have to weigh the trade-offs. And finally, the Visual Studio nuggets just keep coming.  Even though it has been available for several generations of Visual Studio, I have never seen someone use linked files within a solution.  It just goes to show that I should spend more time exploring the deeper features of each dialog.

    Read the article

  • How to batch convert video files on OSX for AppleTV2 / iPhone4?

    - by Luke404
    I'd like a solution to batch convert video files to a format suitable for the AppleTV2, iPad2 and iPhone4, while at the same time preserving as much quality as possible; I want a single output file that will play on all of these devices and is also good for consumption by other Mac software (e.g. Aperture, iMovie, iTunes). Batch processing is a requirement since I'm going to convert many, many files from different sources (mainly lots of videos captured by compact digital cameras, cell phones, and so on). I'm looking into ffmpeg and MEncoder (both installed via MacPorts), but I can't seem to find a suitable preset for libx264 even though everyone out there is talking about them. A different approach involving different software would be OK too, as long as I can script it somehow and run it on a whole directory full of files to be converted.
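
    One scriptable route (an assumption, not the only option; HandBrake's command-line build is also available for OS X) is HandBrakeCLI, whose built-in "AppleTV 2" preset produces H.264/AAC .m4v files that the iPhone 4 and iPad 2 also play and that iTunes imports cleanly:

        #!/bin/bash
        # Convert every file in a directory to an AppleTV2/iPhone4-friendly .m4v.
        for f in /path/to/videos/*; do
            HandBrakeCLI --preset "AppleTV 2" -i "$f" -o "${f%.*}.m4v"
        done

    The preset name and paths are illustrative; run HandBrakeCLI --preset-list to see the presets your build actually ships.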

    Read the article

  • Advanced TSQL training

    - by Dave Ballantyne
    Over the past few years, I've had it on my to-do list to write and deliver a full-scale SQL Server training course, and not just an hour-long, bite-size session at user groups and conferences.  To me, SQL Server development is not just knowing and remembering the syntax of commands.  Sometimes I semi-jest that I have "written a merge statement without looking up the syntax", but I know from my interactions on and off line that I am far from alone in this.  In any case we have an awesome tool in the internet which is great at looking things up. When developing SQL Server based solutions, of more importance is knowing the internals of the engine.  SQL Server is a complex piece of software and we need to be able to understand, to a fairly low level (you can always dive deeper), the choices that it makes and why it makes them in order to deliver performant, reliable, predictable and scalable systems to our customers and end-users. This is the view I shall be taking over two days in March (19th and 20th) in London and, TBH, one I don't see taken often enough. Early bird discounts are available until 31 Dec. Full details of the course and a high-level view of the bullet points we shall be covering are available at the Technitrain site (http://tinyurl.com/TSQLTraining).

    Read the article

  • How to find malicious IPs?

    - by alfish
    Cacti shows unusual and pretty steady high bandwidth to my server (40x the normal), so I guess the server is under some sort of DDoS attack. The incoming bandwidth has not paralyzed my server, but it is of course consuming bandwidth and affecting performance, so I am keen to figure out the possible culprit IPs, add them to my deny list, or otherwise counter them. When I run:
        netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
    I get a long list of IPs with up to 400 connections each. I checked the most frequently occurring IPs, but they come from my CDN. So I am wondering what the best way is to monitor the requests that each IP makes in order to pinpoint the malicious ones. I am using Ubuntu Server. Thanks
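
    A minimal sketch (the CDN prefixes below are placeholders for your CDN's real ranges) is to repeat the count but filter out the known-good sources first, so the remaining top talkers are the blocking candidates:

        # Count connections per source IP, ignoring the CDN's address ranges.
        netstat -ntu | awk 'NR>2 {print $5}' | cut -d: -f1 \
            | grep -vE '^(203\.0\.113\.|198\.51\.100\.)' \
            | sort | uniq -c | sort -rn | head -20

        # Temporarily block a confirmed offender.
        sudo iptables -A INPUT -s 192.0.2.45 -j DROP

    For per-IP request patterns rather than raw connection counts, the web server's access log is usually the better source, e.g. awk '{print $1}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head (the path assumes nginx).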

    Read the article

  • How can I run my program on a large number of computers? [closed]

    - by zenpoy
    I'm looking for a (preferably free) service for running an executable I wrote. It's not malicious, it's not a virus, it's not a scam, and if this is really important I can upload the Python source code instead. I wrote a small crawler to gather information regarding the style of web pages for my MA project, and I need a lot more data. EDIT: Here is more information on my problem, how I am approaching it, and where I'm stuck. As part of my research I'm trying to classify text based on its style (font-family for now). My data is based on web pages, so I wrote a client/server application: the client is a crawler that gathers this data and sends it to the server. The problem is that something like 99% of the internet uses Arial, Verdana and Helvetica; other fonts are far rarer, so I need to spend a very long time gathering enough data regarding those fonts. Hope this explains it.

    Read the article

  • Can a UPS give off an EM field capable of interfering with desktop components?

    - by Magsol
    I own an APC Back-UPS ES 750; it's about 4 years old, and it is the only major component remaining from a problem that has been boggling me for the last 18 months. (Yep, I originally posted this question about the possessed desktop, and while I marked a solution and closed the question, a week later the same problem returned.) I've tried plugging the desktop straight into the wall (but left the other components plugged into the UPS) and the desktop still froze. Is it possible that the EM field generated by the UPS is interfering with my desktop components and causing these otherwise-unpredictable system freezes? To me this sounds like a long shot, but aside from my twin LCD monitors, that just about takes care of all the major components.

    Read the article

  • Are there any good examples of open source C# projects with a large number of refactorings?

    - by Arjen Kruithof
    I'm doing research into software evolution and C#/.NET, specifically on identifying refactorings from changesets, so I'm looking for a suitable (XP-like) project that may serve as a test subject for extracting refactorings from version control history. Which open source C# projects have undergone a large number of refactorings? Criteria: a suitable project has its change history publicly available, has code that compiles at most commits, and has had at least several refactorings applied in the past. It does not have to be well-known, and the code quality or number of bugs is irrelevant. Preferably the code is in a Git or SVN repository. The result of this research will be a tool that automatically creates informative, concise comments for a changeset. This should improve on the common development practice of just not leaving any comments at all. EDIT: As Peter argues, ideally all commit comments would be teleological (goal-oriented). Practically, if a comment is made at all it is often descriptive, merely a summary of the changes. Sadly we're a long way from automatically inferring developer intentions!

    Read the article

  • More Denali Execution Plan Warning Goodies

    - by Dave Ballantyne
    In my last blog, I showed how the execution plan in Denali has been enhanced by two new warnings, "conversion affecting cardinality" and "conversion affecting seek", which are shown when a data type conversion has happened either implicitly or explicitly. That is not all though; there is more.  Also added are two warnings for when performance has been affected due to memory issues. Memory spills to tempdb are a costly operation and happen when SQL Server is under memory pressure and needs to free some up. For a long time you have been able to see these as warnings in a Profiler trace, as a Sort or Hash Warning event, but now they are included right in the execution plan.  Not only that, but you can also see which operator caused the spill, not just which statement.  Pretty damn handy. Another cause of performance problems relating to memory is memory grant waits.  There is an informative write-up on them, but simply speaking, SQL Server has to allocate a certain amount of memory for each statement; if it is unable to, you get a "memory grant wait".  Once again there are other methods of analyzing these, but the plan now shows these too. (Screenshot caption: don't worry, that's not real production code.) There is one other new warning that is of interest to me, "Unmatched Indexes".  Once I find out the conditions under which that fires, I'll blog about it.

    Read the article

  • Need to transfer large video from Camera!-app to computer

    - by Henrik Söderlund
    I have a jailbroken iPhone 4S and am trying to transfer a 25-minute-long HD video that I recorded through SmugMug's Camera Awesome (Camera!) app. Once recorded in the app, the video stays within that app's interface until you choose to save it onto the camera roll. When I try this option, the app just stalls, even when left for an hour plus; I assume the video is too large to copy. I have tried the iExplorer app on my MacBook Air. I can find the Documents folder inside the Camera! folder, but as soon as I access it to view the contents, it stalls the app completely, probably because it tries to read the enormous video. Is there a clever way to transfer this file onto the computer? I can use iFile on the iPhone to transfer over Wi-Fi, but I don't know the Camera! app's Documents folder location on the file system.
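
    Since the phone is jailbroken, one workaround (assuming OpenSSH has been installed from Cydia; the IP address and size threshold below are placeholders) is to locate the file from the command line and pull it over SSH, bypassing both the Camera! UI and iExplorer:

        # Find the app's Documents folder by hunting for large files.
        ssh mobile@192.168.1.20 "find /var/mobile/Applications -type f -size +500000k"

        # Copy the video straight to the Mac (path comes from the find output).
        scp "mobile@192.168.1.20:/var/mobile/Applications/<app-guid>/Documents/video.mov" ~/Desktop/

    The <app-guid> segment is whatever find reports; iOS names app folders with GUIDs, so it cannot be guessed in advance.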

    Read the article

  • Proving file creation dates

    - by Nils Munch
    In a weird case surrounding copyrights of a software system I have developed, I use the fact that I have all the source files of the system in question, created long before I joined the company that claims to own the system. The company being sued by yours truly says that I have simply manipulated the files to appear to be from that date. Is it even possible to fake or manipulate creation dates? And if so, how can I "prove" that the files really are that old? Luckily, I stored my project on GitHub, which confirms the fact that the files are from that era, but that is beside the point. I run purely Apple OS X.
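
    For what it's worth, Git records an author date and a commit date inside every commit object, and GitHub's hosted copy is an independent witness to them. A sketch of pulling out the earliest evidence (the path filter is a placeholder):

        # List the oldest commits first, with both timestamps.
        git log --reverse --date=iso --format="%h %ad (authored) %cd (committed)" -- src/

        # Confirm local history matches what GitHub has been serving.
        git fetch origin && git log --oneline origin/master..master

    Commit dates can be forged on a local machine, but forging them consistently across a hash-chained history that a third party has hosted since the era in question is a far harder claim for the other side to sustain.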

    Read the article

  • MP3 vs M4A (AAC): what is the audio codec for portable devices which gives maximum independence?

    - by akira
    A couple of years back, MP3 was the most supported format for portable devices. Then Apple came along and wiped the floor of all the portable devices with the iPod as well as the iPhone; they clearly favour M4A (AAC). If you had to choose, right now, the 'best' audio codec to encode music to, which would you choose to achieve maximal independence from portable device vendors: MP3 or M4A? (I am well aware of Ogg (Vorbis): no market (maybe this changes with HTML5 and more WebKit on portable devices). I am also aware of FLAC: I don't want to discuss long-term storage.)

    Read the article

  • ISA bus on newer computers

    - by Kevin Ivarsen
    Are there companies that sell new computers that support old ISA bus expansion cards? We have an aging computer running DOS that operates some machinery via an ISA interface board. Updated versions of this board (e.g. PCI, USB) are not available, and I am concerned about the long-term reliability of the 8+ year old computers we currently keep around as backups. If these newer ISA-capable machines exist, are there any general gotchas to be aware of in terms of compatibility with older expansion boards, ability to run DOS, etc.?

    Read the article

  • Iozone: sensible settings for a server with lots of RAM

    - by Frank Brenner
    I have just acquired a server with:
    - 2x quad-core Xeons
    - 48G ECC RAM
    - 5x 160GB SSDs on an LSI 9260-8i
    Before deploying the target platform, I'd like to collect as much benchmark data as possible, testing I/O with hardware RAID in various configurations, ZFS zRAID, as well as I/O performance on vSphere and with KVM virtualization. In order to see real disk I/O performance without cache effects, I tried running Iozone with a maximum file size of more than twice the physical RAM, as recommended in the documentation, so:
        iozone -a -g100G
    However, as one might expect, this takes far too long to be practicable. (I stopped the run after seven hours.) I'd like to reduce the range of record and file sizes to values that might reflect realistic performance for an application server, hopefully getting the run times to under an hour or so. Any ideas? Thanks.
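
    A sketch of a narrower run (the flag choices are assumptions to tune, not a recommendation): limit the tests to sequential and random read/write, bound the record sizes to a realistic range, and use direct I/O so the page cache is bypassed instead of out-sized:

        # Write/rewrite (0), read/reread (1) and random read/write (2) only;
        # 4k-1m record sizes, a fixed 8g file, O_DIRECT to defeat the 48G page cache.
        iozone -a -i 0 -i 1 -i 2 -y 4k -q 1m -n 8g -g 8g -I -f /data/iozone.tmp

    With -I the test file no longer needs to dwarf RAM, which is what brings the run time down; drop the flag for a second pass if you also want cached-path numbers.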

    Read the article

  • Split a table in Word without losing row title

    - by Shane Hsu
    Word has the feature to repeat the title row of a table when the table is so long that it spans a number of pages. I need to categorize my data into several pages, and I did that by splitting the table and inserting page breaks to put each category on a page of its own. So now I have several pages of data, but only the first page has the title row. Is there any way to do this besides manually adding the title row to all the other pages? Original data:
        _________________
        | Cat.   Data    |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |  2      *      |
        |  2      *      |
        |  2      *      |
        |  2      *      |
        |  3      *      |
        |___3______*_____|
    And then turn it into:
        _________________
        | Cat.   Data    |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |  1      *      |
        |___1______*_____|
    Next page:
        _________________
        | Cat.   Data    |
        |  2      *      |
        |  2      *      |
        |  2      *      |
        |___2______*_____|
    Next page:
        _________________
        | Cat.   Data    |
        |  3      *      |
        |___3______*_____|

    Read the article

  • delete multiple files on linux with spaces in file names

    - by raido
    I have a directory on my Linux box with over 10000 files that I have to delete. Running:
        sudo rm -rf /var/tmp/*
    gives the error message:
        sudo: unable to execute /bin/rm: Argument list too long
    The suggested solution to this is to run:
        sudo find /var/tmp | xargs sudo rm
    This only works for files with no spaces in the file name; files whose names contain spaces are not deleted. For example, for a file named 'A File With Spaces in the Name.dat', running the command gives me errors like this:
        rm: cannot remove `/var/tmp/A': No such file or directory
        rm: cannot remove `File': No such file or directory
        rm: cannot remove `With': No such file or directory
        rm: cannot remove `Spaces': No such file or directory
        rm: cannot remove `in': No such file or directory
        rm: cannot remove `the': No such file or directory
        rm: cannot remove `Name.dat': No such file or directory
    How do I pass the complete file path to xargs sudo rm without breaking up the file name?
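
    A minimal fix is to make find and xargs agree on a NUL delimiter, which no file name can contain (GNU find and xargs, as shipped on Linux):

        # NUL-separated pipeline: spaces and newlines in names survive intact.
        sudo find /var/tmp -mindepth 1 -print0 | xargs -0 sudo rm -rf

        # Equivalent without xargs, letting find batch the arguments itself.
        sudo find /var/tmp -mindepth 1 -exec rm -rf {} +

    The -mindepth 1 keeps /var/tmp itself off the deletion list, and both forms sidestep "Argument list too long" because the shell never expands all 10000 names at once.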

    Read the article

  • QR Codes and Short Links - Please Take A Look [closed]

    - by Joe Turner
    I'm looking for a way to create a QR code and a shortened link when a form is submitted. I have the QR code bit, but the link is too long for me, and the resulting QR code looks scary and complicated. The way it works is: the user types in (in this instance) a contract number. Then a folder named after that contract number is created on the server (www.mysite.com/QR/$contractnumber). Then, using PHP again, I create a QR code through Google, because I know that every QR code will link to the same place, just with a different ending to the link. The only bit that changes is the $POST... I was wondering if there was a way to shorten the link before it goes to Google? It would have to be through PHP. The user enters the contract number in the form, then that number (usually around 5/6 digits) will be fed into an already existing command? I'm not an expert in anything, I just know some really random snippets of code... and HTML and CSS, of course. Any help would be appreciated, and judging by the few days I have been searching this, I think it might help a few people in the future. I would also like to confirm that the solution can't be one of those visual URL shorteners; if it is, it just needs to be the back-end of it, built into an existing form and QR generator. Simple?
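
    One approach that fits the existing flow (the TinyURL endpoint shown is public and unauthenticated, so treat this as a sketch; shown as shell commands, each of which maps onto a single PHP file_get_contents() call):

        # 1. Shorten the per-contract URL before handing it to Google.
        SHORT=$(curl -s "http://tinyurl.com/api-create.php?url=http://www.mysite.com/QR/12345")

        # 2. Feed the short link to the Google Charts QR generator.
        curl -s "https://chart.googleapis.com/chart?cht=qr&chs=300x300&chl=${SHORT}" -o qr.png

    A shorter payload means fewer QR modules, so the printed code looks visibly simpler and scans more reliably at small sizes.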

    Read the article

  • An entry-level programmer's best option [on hold]

    - by user134409
    I am facing a puzzle and I am not sure of the best way to make a decision. In my spare time, besides playing video games, I got around to developing some games: nothing fancy, just small projects to get a better grasp of programming. After I finished college and got my BA in Computer Science, I got a job as a web developer at a small firm. The next few months were very stressful, as I had no previous experience and tried my best to make up for it, but after 6 months my boss told me I was inefficient and not very independent and let me go. To my credit, the help from the senior was very limited; I did learn a lot, but I learned it by myself. For example, they told me to do a UI in BackboneJS, and it took me a while, but I got it working (even if it was poorly designed), and I managed to do it all by myself, because my senior was very busy and did not even have time for my questions. Now I have found a new job, again in web development, but I am very afraid of what is going to happen next. I am afraid because I don't want to take the job and then be fired again after a couple of months; I get the feeling that this would look very bad on my CV, since job hopping is like a red flag. They want to hire me, but I am aware that they are working with new technologies and maybe I will end up not coping with them. So the question is: would an entry-level programmer be better off with a starting job in QA and testing, working his way up from there? I did learn a lot from my first job, but it was a moral blow when they decided to fire me. I do have low self-esteem and I know my skills as a programmer are not that great, but I like programming, I want to get better, and I want a long career in it, so that's basically my pickle. Thank you in advance for the answers.

    Read the article

  • Should I sell video tutorials on my own or via publishers like lynda.com? [closed]

    - by Derfder
    I am asking this because I am deciding between two models right now. One way is to create video tutorials on my own (make some short free videos plus long pay-per-download/stream videos); the other is to sell them to lynda.com or tutsplus. The second way is easier, because they will do all the boring business stuff and will host the files for download, etc. In that case, all I need is a good microphone and to obey their guidelines. On the other side, if I do it on my own, I have to do all the unwanted business stuff, pay for the server, and so on. That is quite a big downside; however, I will have all the videos under my control in the future. I know that lynda.com has bigger reach and marketing than I am capable of, but if you take e.g. phpvideotutrials.com (r.i.p ;), I think Leigh was very successful with a relatively small budget. The interesting question will be the cost, or how much they would pay me: would it be less than selling it myself, minus monthly server hosting and other expenses? Any advice from people who actively sell their videos to such companies, or who do it on their own, is highly appreciated.

    Read the article

  • Domain user periodically can't login, but only temporarily?

    - by Josh
    OK, this is a strange one that I'm having trouble replicating, let alone solving. I have a user who uses two computers (both XP SP3) on the domain with a roaming profile. She has no problems on her personal computer but occasionally needs to use a shared computer. On this shared computer she is sometimes (about once a week) unable to log in, with the error message "Username or password incorrect. Check username, password and domain and try again." I've checked when this happens, and her username and password are indeed entered correctly. Now the strange part: if someone else logs in to the computer (which so far has always worked) and then logs out, she is able to log in after that. This problem began after a recent and long overdue password change. I've tried to replicate the problem after a reboot, or after another user logs out, to no avail. Any suggestions on troubleshooting or replicating this one? Has anyone experienced something similar?

    Read the article

  • Could crosslinking using very general anchor texts be a reason for a drop in rankings?

    - by webmasters
    I have crosslinked 20 sites, and I thought I had been penalized for this; I asked this question, and some experienced members told me that crosslinking may not necessarily be the reason. The sites are on the same host, with different C-class IPs, and every site is linked to each other. Each site targets long-tail keywords:
    Site 1 - BMW Used Cars - and my area
    Site 2 - WW Used Cars - and my area
    And so on... When I crosslinked them (in the sidebar), I did it for the users; instead of repeating the terms "used cars" and my location over and over (since my users are targeted), I just crosslinked them using the brand: BMW, WW. Targeting locally, my niches are not overly competitive, so I did not need too many external links to rank in various positions on the first page. I'm thinking that when I chose to link using only the brand, Google might have thought I wanted to actually rank for BMW and WW, hence the drop in my targeted local traffic. Could this be? I have now no-followed the links and am noticing a slight recovery, but if it's not an interlinking penalty, it would be a shame not to benefit from my links.

    Read the article

  • Printing documents on HP printer very slow

    - by maxim
    I have a strange problem with HP drivers. I have configured an HP 2025dn printer with a LAN connection on 3 PCs, using the HP driver CD. All works well, but sometimes, for certain documents, printing is very slow and takes a long time. During this situation I observe, in Task Manager, a process called rundll32.exe loading the CPU at 100%. If I kill that process, the printer quickly starts printing the document in the queue. I am wondering about the reason for this strange behaviour.

    Read the article

  • HAProxy being killed with more that 54,000 connections

    - by Olly
    I am trying to run HAProxy (1.4.8) on an EC2 machine running Ubuntu 10.04. I need HAProxy to be able to handle many thousands of long-running persistent connections (websockets). With the current setup, HAProxy gets killed at roughly 54,300 connections. If I run HAProxy in the foreground, the only output is "Killed". Am I right in thinking this is the kernel killing the process? Is this because it is out of resources? Can I increase the resources? The CPU and memory consumption are low with 50,000 connections, so I don't suspect either of these. How can I prevent this from happening?
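
    "Killed" on stdout is the signature of SIGKILL, and on Linux the usual sender is the out-of-memory killer; the kernel log will say so explicitly. A few hedged checks (the haproxy.cfg numbers below are illustrative, not tuned):

        # Was it the OOM killer?
        dmesg | grep -iE 'killed process|out of memory'

        # File-descriptor ceilings: each proxied connection uses two sockets,
        # so ~54k connections needs well over 108k descriptors.
        ulimit -n
        cat /proc/sys/fs/file-max

    In haproxy.cfg, the global section's maxconn and ulimit-n settings are sized together (e.g. maxconn 100000 with ulimit-n 200018); HAProxy 1.4 raises its own descriptor limit from maxconn when started as root, so a low global maxconn, a small fs.file-max, or memory pressure from per-connection buffers on a small EC2 instance are the usual suspects.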

    Read the article
