Search Results

Search found 227 results on 10 pages for 'parallels'.

  • Fedora Core 6 Migration

    - by Matthew Sprankle
    I am at a loss as to what I should do for this server. I need it to run PHP 5.3 and a corresponding version of MySQL. I received a client today through work that is using Fedora Core 6, running 10 very small websites on a very hodgepodge setup. My original idea was just to upgrade to PHP 5.3. I have yum (version 3.0.8 installed) reconfigured to use the Fedora archive, but the latest version of PHP it offers is 5.1.8. I am still relatively new to server setups and am nervous about wiping their server to upgrade it. Since it is about 6-8 years old, I'm not sure it will even support the newest version of Fedora. The server specs are: Parallels Plesk Panel version 9.5.4, operating system Linux 2.6.9-023stab048.4-smp, CPU GenuineIntel, Intel(R) Xeon(R) CPU E5335 @ 2.00GHz (10 GB disk space and 1 GB of memory). I use Fedora for my personal server, so I was a little familiar with it, but I haven't done anything too extravagant. Is there a way I can escape this nightmare by installing PHP 5.3, or do I need to migrate these sites to a new server?

    Read the article

  • Developer’s Life – Every Developer is a Batman

    - by Pinal Dave
    Batman is one of the darkest superheroes in the fantasy canon.  He does not come to his powers through any sort of magical coincidence or radioactive insect, but through a lot of psychological scarring caused by witnessing the death of his parents.  Despite his dark back story, he possesses a lot of admirable abilities that I feel bear comparison to developers. Batman has the distinct advantage that his alter ego, Bruce Wayne, is a millionaire (or billionaire in today’s reboots).  This means that he can spend his time working on his athletic abilities, building a secret lair, and investing his money in cool tools.  This might not be true for developers (well, most developers), but I still think there are many parallels. So how are developers like Batman? Well, read on for my list of reasons. Develop Skills: Batman works on his skills.  He didn’t get the strength to scale Gotham’s skyscrapers by inheriting his powers or suffering an industrial accident.  Developers also hone their skills daily.  They might not be doing pull-ups and scaling buildings, but I think their skills are just as impressive. Clear Goals: Batman is driven to build a better Gotham.  He knows that the criminal who killed his parents was a small-time thief, not a super villain – so he has larger goals in mind than simply chasing one villain.  He wants his city as a whole to be better.  Developers are also driven to make things better.  It can be easy to get hung up on one problem, but in the end it is best to focus on the well-being of the system as a whole. Ultimate Team Players: Batman is the hero Gotham needs – even when that means appearing to be the bad guy.  Developers probably know that feeling well.  Batman takes the fall for a crime he didn’t commit, and developers often have to deliver bad news about the limitations of their networks and servers.  It’s not always a job filled with glory and thanks, but someone has to do it. Always Ready: Batman and the Boy Scouts have this in common – they are always prepared.  Let’s add developers to this list.  Batman has an amazing tool belt with gadgets and gizmos, and let’s not even get into all the functions of the Batmobile!  Developers’ tool belts might be the knowledge and skills they have developed, not tools they can carry in a utility belt, but that doesn’t make them any less impressive. 100% Dedication: Bruce Wayne cultivates the personality of a playboy, never keeping the same girlfriend for long and spending his time partying.  Even though he hides it, his driving force is his deep concern and love for his friends and the city as a whole.  Developers also care a lot about their company and its employees – even when it is driving them crazy.  You do your best work when you care about your job on a personal level. Quality Output: Batman believes the city deserves to be saved.  The citizens might have a love-hate relationship with both Batman and Bruce Wayne, and employees might not always appreciate developers.  Batman and developers, though, keep working for the best of everyone. I hope you are all enjoying reading about developers-as-superheroes as much as I am enjoying writing about them.  Please tell me how else developers are like superheroes in the comments – especially if you know any developers who are faster than a speeding bullet and can leap tall buildings in a single bound. Reference: Pinal Dave (http://blog.sqlauthority.com). Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL. Tagged: Developer, Superhero

    Read the article

  • Blogging: MacJournal & Windows Live Writer

    - by Jeff Julian
    One thing I have learned about using a Mac is that Apple does not produce very many free applications. The ones they do produce are typically not full featured, and to get the full features you need to buy their upgraded version. For example, when it comes to photo editing and cataloging, iPhoto is not a solution for large files or RAW processing; you need Aperture, which is a couple hundred dollars. I am not complaining, because I like it when an application has a product team who generates revenue with it, because the chance of them being around longer seems to be higher. What is my point in all of this? Apple does not produce a product for blogging/journaling like Microsoft does with Windows Live Writer. I love Windows Live Writer. If you are on a Windows box, it is a required tool in your toolbox if you publish to a blog. The cleanness of the interface, integration with most blog APIs and ability to Save Local or Publish as a Draft make capturing your thoughts for publishing now or later a very easy task. My hope is that Microsoft will port it to the Mac, but I don’t believe that will ever happen, as it is not a revenue generating product and Microsoft doesn’t often port to the Mac beyond Remote Desktop Connection and MSN Messenger. For my configuration, I used to use only Boot Camp on the two MacBook Pros I have owned in the past three years, because I’m a PC, but after four different rebuilds (not typically due to Windows, but Boot Camp or Parallels) I decided to move off the Boot Camp platform and to VMWare Fusion. This is a completely separate blog post that I should spec out in MacJournal, but I now always boot into the Mac OS and use Fusion for my AJI Software VM or my client’s VMs. It just seems to work better for me, and I have a very nice way to back up my Windows environments with VMWare. Needless to say, there was a need in my new laptop configuration for a blogging tool that worked natively on a Mac. I don’t like to power up my machine for writing a document or working on an image and need to boot up a VM just so I can use Windows. Some would say why not just use a Windows laptop and put the MBP on eBay? It is just a preference, and right now, I like the Mac OS for day to day work. So in comes MacJournal, part of the current MacHeist package for $19.95 (MacJournal is normally $39.95). This product is definitely not WLW, but WLW is missing some features I like in MacJournal. I hope the price point comes down on MacJournal, because I could see paying $19.95 for it, but it is always hard for me to buy a piece of software for $39.95 when I can use something else. But I am a cheapskate when it comes to software packages. I suggest, if you are using a Mac, that you drop what you are doing and pick up the MacHeist bundle today before it is over, but if you are reading this later, then download the trial and see if MacJournal is a solution for you. If you have any other suggestions that are as nice or cheaper, please comment.
    Product Links: MacJournal by Mariner Software, $39.95 (part of the MacHeist bundle for $19.95, with only one day left); Windows Live Writer by Microsoft.
    This post was created using MacJournal.
    [Update: The joys of formatting. Make sure if you are a Geekswithblogs.net member that you use this configuration to set up the Metablog formatting of paragraphs correctly]

    Read the article

  • The theory of evolution applied to software

    - by Michel Grootjans
    I recently realized the many parallels you can draw between the theory of evolution and evolving software. Evolution is not the proverbial million monkeys typing on a million typewriters, where one of them comes up with the complete works of Shakespeare. We would have noticed by now, since the proverbial monkeys are now blogging on the Internet ;-) One of the main ideas of the theory of evolution is the balance between random mutations and natural selection. Random mutations happen all the time: millions of mutations over millions of years. Most of them are totally useless. Some of them are beneficial to the evolved species. Natural selection favors the beneficially mutated species. Less beneficial mutations die off. The mutated rabbit doesn't have to be faster than the fox. It just has to be faster than the other rabbits.
    Theory of evolution: Random mutations happen all the time. Most of these mutations are so bad that the new species dies off or cannot reproduce.
    Evolving software: Developers write new code all the time. New ideas come up during the act of writing software. The really bad ones don't get past the stage of an idea. The bad ones don't get committed to source control.
    Theory of evolution: Natural selection favors the beneficially mutated species.
    Evolving software: Good ideas and new code get discussed in the group during informal peer review. Less-than-good code gets refactored. Enhanced code becomes more readable, maintainable...
    Theory of evolution: A good set of traits makes the species superior to others. It becomes widespread.
    Evolving software: A good design tends to make it easier to add new features, easier to understand the current implementations, easier to optimize for performance... thus superior. The best designs get carried over from project to project. They appear in blogs, articles and books about principles, patterns and practices.
    Of course the act of writing software is deliberate. This can hardly be called random mutation. Though it sometimes might seem that code evolves through a will of its own ;-) Does this mean that evolving software (evolution) is better than a big design up front (creationism)? Not necessarily. It's a false idea to think that a project starts from scratch and everything evolves from there. Everyone carries his experience of what works and what doesn't. Up front design is necessary, but is best kept simple and minimal, just enough to get you started. Let the good experiences and ideas help to drive the process, whether they come from you or from others, from past experience or from the most junior developer on your team. Once again, balance is the keyword. Balance design up front with evolution on a daily basis. How do you know what balance is right? Through your own experience of what worked and what didn't (here's evolution again).
    Notes: The evolution of software can quickly degenerate without discipline. TDD is a discipline that leaves little to chance on that part. Write your test to describe the new behavior. Write just enough code to make it behave as specified. Refactor to evolve the code to a higher standard. The responsibility of good design rests continuously on each developer's shoulders. Promiscuous pair programming helps quickly spread the design to the whole team.
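
    As a rough illustration of the red-green-refactor cycle described in the notes above, here is a minimal sketch using NUnit (the test framework and the PriceCalculator example are assumptions made purely for illustration, not code from the post):

        using NUnit.Framework;

        // Step 1 (red): write a test that describes the new behavior and fails.
        [TestFixture]
        public class PriceCalculatorTests
        {
            [Test]
            public void Applies_ten_percent_discount_to_orders_over_100()
            {
                var calculator = new PriceCalculator();
                Assert.AreEqual(108m, calculator.FinalPrice(120m));
            }
        }

        // Step 2 (green): write just enough code to make the test pass.
        // Step 3 (refactor): improve names and structure while the test stays green.
        public class PriceCalculator
        {
            private const decimal DiscountThreshold = 100m;
            private const decimal DiscountRate = 0.10m;

            public decimal FinalPrice(decimal orderTotal)
            {
                // Orders above the threshold get the discount; everything else is unchanged.
                return orderTotal > DiscountThreshold
                    ? orderTotal * (1 - DiscountRate)
                    : orderTotal;
            }
        }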

    Read the article

  • First Impressions of a MacBook (from a PC guy)

    - by dgreen
    Disclaimer: I've been a PC guy my entire working career. I'd probably characterize myself as a power user. Never afraid to bust out the console line. But working with a Mac is totally foreign to me. So for those Mac guys who are curious, this is how your world appears from the outside to a computer-literate person :)
    My MacBook Air has arrived! And it's a thing of beauty. First, the specs: 13" MacBook Air, 2.0GHz Core i7 processor. Upgraded to 8GB of RAM for an additional $100, SSD flash storage = 256GB. The plan is ultimately to use this baby for some iOS development but also some decent lifting in Windows with Visual Studio. Done a lot of reading, and between VMWare Fusion, Parallels and Boot Camp... I'm going to go with VMWare Fusion for $49.99.
    And now my impressions (please re-read the disclaimer before proceeding!):
    I open the box and am trying to understand exactly how the MagSafe connector works (and how to disconnect it). Why does it have two socket outlet plugs? Who knows. I feel like Hansel in Zoolander. The files are "in" the computer.
    Stuck in my external hard drive (USB). So how do I get to the files? To the Googles!
    Argh... it can't read my external NTFS drive. FAT32 can't support files over 4GB… problematic since some of my existing VMWare image files are much larger than 4GB. Didn't see this coming.
    Three year old loves iPhoto. Super easy to use. Don't even know what I'm doing but I've already (accidentally) discovered the image filtering options. Fun stuff.
    First thing I downloaded ever => Chrome. I need something to ground me, something familiar. My token, if you will (sorry, gratuitous Inception joke).
    Ok, I get it… Finder == Windows Explorer. But where is my hierarchical structure? I miss the tree :(
    On that note, yeah… how do I see what "path" my files reside in? I'm afraid to know the answer. You know what scares me more though… this notion of a smart folder. Feel like the godfather - just get the job done, I don't care how you handle it, I don't want to know... just get it done.
    What the hell is AirDrop?
    Mail… just worked. Still in shock that they have a free client for Yahoo mail (please no Yahoo jokes).
    Mail -> deleting a message takes 5 seconds. Have they heard of async?
    "Command" key instead of "Control"... ok, then what the $%&^! is the Control key for then?
    "Aliases" == shortcuts, I think.
    I don't see the file system. And I'm scared. All these things I'm downloading… these .dmg files (bad name), where are they going? Can't seem to delete them when they're done.
    Ugh... realized I need to buy a mini-to-VGA adaptor if I want to use my external monitor ($13 on eBay, $39 in the Apple store).
    Window docking is trickiest for me… this notion of detached windows with a menu bar at the top. I don't like this paradigm, it's confusing. But maybe that's because I've been using Windows for too long.
    Evernote and Dropbox desktop clients seem almost identical… a few quirks here and there I need to get used to.
    iTunes is still a bit gross. In a weird way it's actually worse on a Mac, if that's possible. This is not the MacBook's fault… this is a software design issue.
    Overall: the UI will take some getting used to. Can't decide if this represents the future and I'm stuck in the past… or this is the past and I've been spoiled by the future (which would be Windows… don't be hating, I happen to be very productive in Win7). So there you go - my 90-minute first impression of the MacBook universe.

    Read the article

  • The Evolution of Oracle Direct EMEA by John McGann

    - by user769227
    John is expanding his Dublin based team and is currently recruiting a Director with marketing and sales leadership experience: http://bit.ly/O8PyDF Should you wish to apply, please send your CV to [email protected] Hi, my name is John McGann and I am part of the Oracle Direct management team, based in Dublin.   Today I’m writing from the Oracle London City office, right in the heart of the financial district and up to very recently at the centre of a fantastic Olympic Games. The Olympics saw individuals and teams from across the globe competing to decide who is Citius, Altius, Fortius - “Faster, Higher, Stronger" There are lots of obvious parallels between the competitive world of the Olympics and the Business environments that many of us operate in, but there are also some interesting differences – especially in my area of responsibility within Oracle. We are of course constantly striving to be the best - the best solution on offer for our clients, bringing simplicity to their management, consumption and application of information technology, and the best provider when compared with our many niche competitors.   In Oracle and especially in Oracle Direct, a key aspect of how we achieve this is what sets us apart from the Olympians.  We have long ago eliminated geographic boundaries as a limitation to what we can achieve. We assemble the strongest individuals across multiple countries and bring them together in teams focussed on a single goal. One such team is the Oracle Direct Sales Programs team. In case you don’t know, Oracle Direct EMEA (Europe Middle East and Africa) is the inside sales division in Oracle and it is where I started my Oracle career.  I remember that my first role involved putting direct mail in envelopes.... things have moved on a bit since then – for me, for Oracle Direct and in how we interact with our customers. Today, the team of over 1000 people is located in the different Oracle Direct offices around Europe – the main ones are Malaga, Berlin, Prague and Dubai plus the headquarters in Dublin. We work in over 20 languages and are in constant contact with current and future Oracle customers, using the latest internet and telephone technologies to effectively communicate and collaborate with each other, our customers and prospects. One of my areas of responsibility within Oracle Direct is the Sales Programs team. This team of 25 people manages the planning and execution of demand generation, leading the process of finding new and incremental revenue within Oracle Direct. The Sales Programs Managers or ‘SPMs’ are embedded within each of the Oracle Direct sales teams, focussed on distinct geographies or product groups. The SPMs are virtual members of the regional sales management teams, and work closely with the sales and marketing teams to define and deliver demand generation activities. The customer contact elements of these activities are executed via the Oracle Direct Sales and Business Development/Lead Generation teams, to deliver the pipeline required to meet our revenue goals. Activities can range from pan-EMEA joint sales and marketing campaigns, to very localised niche campaigns. The campaigns might focus on particular segments of our existing customers, introducing elements of our evolving solution portfolio which customers may not be familiar with. The Sales Programs team also manages ‘Nurture’ activities to ensure that we develop potential business opportunities with contacts and organisations that do not have immediate requirements. 
Looking ahead, it is really important that we continue to evolve our ability to add value to our clients and reduce the physical limitations of our distance from them through the innovative application of technology. This enables us to enhance the customer buying experience and to enable the Inside Sales teams to manage ever more complex sales cycles from start to finish.  One of my expectations of my team is to actively drive innovation in how we leverage data to better understand our customers, and exploit emerging technologies to better communicate with them.   With the rate of innovation and acquisition within Oracle, we need to ensure that existing and potential customers are aware of all we have to offer that relates to their business goals.   We need to achieve this via a coherent communication and sales strategy to effectively target the right people using the most effective medium. This is another area where the Sales Programs team plays a key role.

    Read the article

  • Universities 2030: Learning from the Past to Anticipate the Future

    - by Mohit Phogat
    What will the landscape of international higher education look like a generation from now? What challenges and opportunities lie ahead for universities, especially “global” research universities? And what can university leaders do to prepare for the major social, economic, and political changes—both foreseen and unforeseen—that may be on the horizon? The nine essays in this collection proceed on the premise that one way to envision “the global university” of the future is to explore how earlier generations of university leaders prepared for “global” change—or at least responded to change—in the past. As the essays in this collection attest, many of the patterns associated with contemporary “globalization” or “internationalization” are not new; similar processes have been underway for a long time (some would say for centuries).[1] A comparative-historical look at universities’ responses to global change can help today’s higher-education leaders prepare for the future. Written by leading historians of higher education from around the world, these nine essays identify “key moments” in the internationalization of higher education: moments when universities and university leaders responded to new historical circumstances by reorienting their relationship with the broader world. Covering more than a century of change—from the late nineteenth century to the early twenty-first—they explore different approaches to internationalization across Europe, Asia, Australia, North America, and South America. Notably, while the choice of historical eras was left entirely open, the essays converged around four periods: the 1880s and the international extension of the “modern research university” model; the 1930s and universities’ attempts to cope with international financial and political crises; the 1960s and universities’ role in an emerging postcolonial international development apparatus; and the 2000s and the rise of neoliberal efforts to reform universities in the name of international economic “competitiveness.” Each of these four periods saw universities adopt new approaches to internationalization in response to major historical-structural changes, and each has clear parallels to today. Among the most important historical-structural challenges that universities confronted were: (1) fluctuating enrollments and funding resources associated with global economic booms and busts; (2) new modes of transportation and communication that facilitated mobility (among students, scholars, and knowledge itself); (3) increasing demands for applied science, technical expertise, and commercial innovation; and (4) ideological reconfigurations accompanying regime changes (e.g., from one internal regime to another, from colonialism to postcolonialism, from the cold war to globalized capitalism, etc.). Like universities today, universities in the past responded to major historical-structural changes by internationalizing: by joining forces across space to meet new expectations and solve problems on an ever-widening scale. Approaches to internationalization have typically built on prior cultural or institutional ties. In general, only when the benefits of existing ties had been exhausted did universities reach out to foreign (or less familiar) partners. 
As one might expect, this process of “reaching out” has stretched universities’ traditional cultural, political, and/or intellectual bonds and has invariably presented challenges, particularly when national priorities have differed—for example, with respect to curricular programs, governance structures, norms of academic freedom, etc. Strategies of university internationalization that either ignore or downplay cultural, political, or intellectual differences often fail, especially when the pursuit of new international connections is perceived to weaken national ties. If the essays in this collection agree on anything, they agree that approaches to internationalization that seem to “de-nationalize” the university usually do not succeed (at least not for long). Please continue reading the other essays at http://globalhighered.wordpress.com/

    Read the article

  • LinqToSql - Parallel - DataContext and Parallel

    - by Gregoire
    In .NET 4 and a multicore environment, does the LINQ to SQL DataContext object take advantage of the new parallel features if we use DataLoadOptions.LoadWith? EDIT: I know LINQ to SQL does not parallelize ordinary queries. What I want to know is: when we specify DataLoadOptions.LoadWith, does it use parallelization to perform the match between each entity and its sub-entities? Example:
        using (MyDataContext context = new MyDataContext())
        {
            DataLoadOptions options = new DataLoadOptions();
            options.LoadWith<Product>(p => p.Category);
            return this.DataContext.Products.Where(p => p.SomeCondition);
        }
    generates the following SQL:
        SELECT Id, Name FROM Categories
        SELECT Id, Name, CategoryId FROM Products WHERE p.SomeCondition
    When all the products are created, will we have
        categories.ToArray();
        Parallel.ForEach(products, p =>
        {
            p.Category = categories.FirstOrDefault(c => c.Id == p.CategoryId);
        });
    or
        categories.ToArray();
        foreach (Product product in products)
        {
            product.Category = categories.FirstOrDefault(c => c.Id == product.CategoryId);
        }
    ?
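
    One way to see what LINQ to SQL actually does with LoadWith, rather than guessing at the parallelism, is to attach a logger to the DataContext and time the materialization. DataContext.Log, DataLoadOptions and LoadWith<T> are standard LINQ to SQL members; MyDataContext, Product, Category and SomeCondition are the question's own hypothetical types, so treat this only as a sketch:

        using System;
        using System.Data.Linq;
        using System.Diagnostics;
        using System.Linq;

        using (MyDataContext context = new MyDataContext())
        {
            // Every SQL statement the DataContext generates is written here.
            context.Log = Console.Out;

            DataLoadOptions options = new DataLoadOptions();
            options.LoadWith<Product>(p => p.Category);   // eager-load Category with each Product
            context.LoadOptions = options;

            Stopwatch watch = Stopwatch.StartNew();
            var products = context.Products.Where(p => p.SomeCondition).ToList();
            watch.Stop();

            Console.WriteLine("Materialized {0} products in {1} ms",
                products.Count, watch.ElapsedMilliseconds);
        }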

    Read the article

  • Using a Mac for cross platform development?

    - by mdec
    Who uses Macs for cross-platform development? By cross-platform I essentially mean you can compile to target Windows or Unix (not necessarily both at the same time). I understand that this also has a lot to do with writing portable code, but I am more interested in people's experience with using Mac OS X to develop software. I understand that there are a range of IDEs to choose from; I would probably use Eclipse (I like the GCC toolchain), however Xcode seems to be quite popular. Could it be used as described above? At a pinch I could always virtualise with VirtualBox, VMware Player or Parallels to use Visual Studio (or dual boot for that matter). Having said that, I am open to any other suggested compilers (preferably with an IDE that uses GCC). Also, with the range of Macs available, which one would you recommend? I would prefer a laptop (as I already have a desktop) but am unsure of reasonable specifications. If you are currently using a Mac to do development, I would love to hear what you develop on your Mac and what you like and don't like about it. I would primarily be developing in C/C++/Java. I am also looking to experiment with Boost and Qt, so I'm interested in hearing about any (potential) compatibility issues. If you have any other tips I'd love to hear what you have to say.

    Read the article

  • what is this internet explorer (javascript?) syntax error -2146827286?

    - by ben
    Hi everyone. So, I've been trying to troubleshoot this bug where some big percentage of Windows users who are on my ajax (jQuery) web app are not able to play. I haven't been able to reproduce it on my end with Windows 7 and IE8 running in a Parallels VM. The main problem seems to be in the JavaScript somewhere, because what users are complaining about is that an ajax button isn't working. They click it and nothing happens, so either the event isn't firing, or my ajax call is failing, and possibly the return from the ajax could be failing. After trying some ideas, a friend suggested I check out damnit! https://damnit.jupiterit.com/ which will catch exceptions in JavaScript and email them to you. This is a pretty awesome tool! So, now I have a little more data, but am stuck. Basically, the majority of the exceptions seem to be complaining about a syntax error. I will paste a sample below.
    message: Syntax error
    number: -2146827286
    description: Syntax error
    Browser: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.2; .NET CLR 1.0.3705; OfficeLiveConnector.1.3; OfficeLivePatch.0.0)
    What's interesting is that the syntax error is consistently occurring in browsers reporting MSIE 8.0 but with Windows Vista, XP, and below, so an older OS with the latest IE. Does anyone know of this error? Could this possibly be some weird slow computer/slow internet connection thing where maybe my JavaScript files aren't fully loaded before I call the functions? I am using the jQuery $(document).ready() to wait before I set up anything.

    Read the article

  • Source Control - XCode - Visual Studio 2005/2008 / 2010

    - by Mick Walker
    My apologies if this has been asked before; I wasn't quite sure if this question should be asked on a programming forum, as it relates more to the programming environment than a particular technology, so please accept my (double) apologies if I am posting this in the wrong place. My logic in this case was: if it affects the code I write, then this is the place for it. At home, I do a lot of my development on a Mac Pro. I do development for the Mac, iPhone and Windows on this machine (Xcode & Visual Studio - multiple versions installed in Boot Camp, but generally I run it via Parallels). When visiting a client, I have a similar setup, but on my MacBook Pro. What I want is a source control solution to install on the Mac Pro that will support both Xcode and multiple versions of Visual Studio, so that when I visit a client, I can simply grab the latest copy from source control via the MacBook Pro. While I am visiting the client, he or she may suggest changes, and minor ones I would tend to make on site, so I need the ability to merge any modified code back into the trunk of the project/solution when I return home. At the moment, I am using no source control at all, and rely on simply copying folders and overwriting them when I return from a client - that's my 'merge'!!! I was wondering if anyone had any ideas for a source control provider I could use which would support both Windows and Mac development environments, and is cheap (free would be better).

    Read the article

  • Troubleshooting ASP .NET Application on Shared Hosting

    - by James
    Hi, My company has a CRM site hosted externally on a shared server and recently it has been very problematic. Users are being logged out randomly, sometimes only seconds after logging in. We are also getting viewstate validation errors at times. Both problems seem to occur more often when there are two or more people logged in at the same time, but I can't really see any particular pattern. I am using log4net to track the application state and from what I can tell it seems that the application is frequently restarting, causing all sorts of issues. I can see log messages from the Application_Start event handler but there is not always a corresponding message from the Application_End event handler. There is also logging code in the Application_Error event handler but it is not catching anything at the time of the restart. These errors started to occur soon after we moved our site to this shared server, although I don't remember it being this bad at first. Any advice on how to track down these problems would be appreciated. The server is running Windows Server 2003 and IIS 6.0. Sadly I don't have access to the server other than through Parallels Plesk and it doesn't seem to have any useful diagnostic information.
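
    For the "frequently restarting" symptom specifically, one widely used trick on shared hosting (where you cannot read the server's event log yourself) is to ask ASP.NET why it tore the AppDomain down, by reflecting over HttpRuntime's internal shutdown fields in Application_End and writing them out with the existing log4net setup. This is a hedged sketch: _theRuntime, _shutDownMessage and _shutDownStack are internal, undocumented fields described in well-known ASP.NET troubleshooting write-ups, and their names could differ between framework versions:

        using System;
        using System.Reflection;
        using System.Web;
        using log4net;

        public class Global : HttpApplication
        {
            private static readonly ILog Log = LogManager.GetLogger(typeof(Global));

            protected void Application_End(object sender, EventArgs e)
            {
                // NOTE: internal fields; names are an assumption based on public write-ups.
                object runtime = typeof(HttpRuntime).InvokeMember(
                    "_theRuntime",
                    BindingFlags.NonPublic | BindingFlags.Static | BindingFlags.GetField,
                    null, null, null);
                if (runtime == null) return;

                BindingFlags instanceField =
                    BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.GetField;
                string message = (string)runtime.GetType().InvokeMember(
                    "_shutDownMessage", instanceField, null, runtime, null);
                string stack = (string)runtime.GetType().InvokeMember(
                    "_shutDownStack", instanceField, null, runtime, null);

                // Logged alongside the existing Application_Start/End messages.
                Log.WarnFormat("Application_End. Shutdown reason: {0} Stack: {1}", message, stack);
            }
        }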

    Read the article

  • Django Development Environment Setup Questions

    - by Ross Peoples
    Hello, I'm trying to set up a good development environment for a Django project that I will be working on from two different physical locations. I have two Mac machines, one at home and one at work, that I do most of my development on. I currently run an Ubuntu virtual machine on one of the machines to host the Django environment, install Dropbox on it, and edit source code from my Mac. When I save the code file, the changes get synced over Dropbox to the Ubuntu VM and the Django development server automatically restarts because of the change. This method has worked well in the past, but I am starting to use Dropbox for a lot of other things now and don't want all of that to be downloaded on every virtual machine I use. Plus, I want to start using Eclipse + PyDev to be able to debug code and have code completion. Currently, I use TextEdit, which is great but doesn't support debugging or completion. So what are my options? I thought about setting up a Parallels VM on a thumb drive that has my entire environment on it (Eclipse included), but that has its own problems. Any other thoughts?

    Read the article

  • assign characters to key combinations in XP or Visual Studio .Net

    - by cpj
    I'm running Mac OSX on a MacBookPro (UK keyboard). I run Windows XP under Parallels in a VM. I run Visual Studio .NET 2003 and 2008 in XP in the VM when I need to. I have English United Kingdom and English United States keyboards set up in XP (they switch sometimes for no apparent reason). There is no hash "#" key on my Mac's keyboard. However, in OSX I can get a hash with an alt+3 key combination. But in Windows XP... I cannot make a "#" character. I can go to the character map in Windows and copy a hash.. switch into OSX and copy a hash.. search in code and copy a hash.. but I can not make a hash in XP using my keyboard without typing U+0023, which you can imagine is annoying. Coding anything with hash symbols is becoming a chore. Anyone got any advice or key mapping tricks I can use to get hash characters working in XP using my Mac UK keyboard?

    Read the article

  • MySQL Execution Time Spikes

    - by Brett
    I am having issues with MySQL all of a sudden today. Details:
    OS: CentOS release 5.7
    Server type: Parallels Virtuozzo container running on a mediatemple DV 4.0 package
    Average total memory usage: <500mb
    Total memory usage allowed: 1gb (part of a shared pool for emergency only; users are only guaranteed 500mb)
    Processor: 1ghz
    Main database sizes with most usage: 275mb & 107mb
    Server stack: nginx 1.0.10, mysql 5.1.54, php 5.3.8 with php-fpm
    innodb_buffer_pool_size=100M
    php-fpm max children: 5
    Webapps: custom php-based sites, Magento & Drupal
    Slow query timeout is set to 1 second
    Steps I completed towards diagnosis:
    Cannot restart the container yet - I will try later tonight when our domestic traffic has dropped.
    Enabled the mysql and php-fpm slow logs. Found functions that did DB queries in the php-fpm slowlog taking over 1s to complete at times.
    Found some simple queries in the mysql slowlog taking well over 1s to complete that should take less than 1s.
    Most interesting - execution time seems to spike at times. A query will take .2s a couple of times, then one time it will take 8s to run the same query. These results were verified by running raw SQL queries through the mysql command line.
    Top does not reveal anything too interesting. The only resource-related thing I can see is load averages much higher than normal.
    Up until today, mysql has been fine; there have been no major changes to the db since yesterday.
    Sometimes things are so bad that I am seeing bad gateway errors after 60s of execution time.
    InnoDB is doing on average 300-1400 reads/sec. MySQL is doing 3-10 queries/sec.
    The slow query count in 2 hours of uptime is 171 (with the slow timeout at 1 second).
    Tried restarting mysql, nginx, php-fpm multiple times.
    For example, this query took 17 sec to complete one time, and the rest of the time around .079 sec (it varies: sometimes 1 sec, sometimes .004 sec), running the same query over and over with a couple of seconds in between each run:
        UPDATE `catalogsearch_query` SET `query_text` = 'EW 90', `num_results` = '7532', `popularity` = '99180', `redirect` = NULL, `synonym_for` = NULL, `store_id` = '1', `display_in_terms` = '1', `is_active` = '1', `is_processed` = '1', `updated_at` = '2012-05-08 21:38:31' WHERE (query_id='31');
    Most tables are InnoDB, and sometimes I noticed the lock time taking 90% of the query execution time, but most of the time the lock time is insignificant. Any idea what's going on here?

    Read the article

  • Unable to connect to SVN server on VPS : Subversion Configuration Problem

    - by Pritam Barhate
    Hello everybody, I purchased a VPS account (centos-5-x86_64) from Hostgator, mainly for the purpose of setting up an online Subversion server. This is the first time I am managing a VPS and my Linux skills are not so great, but I have basic knowledge of the OS and have been using it on and off as a desktop for the last few years. So after going through a few tutorials online, as the first step I logged in using the SSH root account provided by Hostgator and tried to run yum install mod_dav_svn subversion. As it turns out, my account has cPanel/WHM, and since it has some concept called EasyApache, the straightforward procedure of yum install mod_dav_svn subversion won't work. After that I found out how it can be worked out by compiling the source and stuff, but the whole procedure looked long and scary. [I am just a Linux nub.] So I decided I would just skip the whole Apache integration stuff and just access the server using the svn:// protocol; anyway, that's how I configure svn on our LAN. So I installed Subversion using yum install subversion. It installed fine. Then I created a folder /svn_repos/testproject and ran svnadmin create /svn_repos/testproject/ - no problems. Using vi I changed the svnserve.conf and passwd files for the repository and added a user with my name. Anonymous users don't have any access; authenticated users have write access. Then I started svnserve using svnserve -d, then in the same terminal window ran svn list svn://localhost/svn_repos/testproject. It asks for authentication for the realm; I provided the root password, then the svn username and password. Provided both. The command returns nothing but exits properly. That it returns nothing is understood, since I didn't import anything yet. But if I try to access svn remotely from another terminal with svn list svn://ip.add.of.server/svn_repos/testproject, then svn: Can't connect to host 'ip.add.of.server': Operation timed out is what I get. The Parallels Power Panel that I got from Hostgator reports that: The firewall is not active now. To activate the firewall, choose one of the firewall operation modes. So if the firewall is not running and I can access svn using localhost, why is the operation timing out when I try to access svn from a remote machine? Experienced network admins, please help. Thanks in advance. Also, please suggest a good book which gives detailed information on configuring dedicated servers + WHM and cPanel.

    Read the article

  • PHP versions warning on Plesk 11.0.9 upgrade on CentOS server

    - by Pixman
    I have a server running Plesk 10.4.4 and I want to upgrade it to 11.0.9. When I use the online upgrade tool, I get this warning:
    Parallels Panel pre-upgrade check...
    WARNING: You have a mixed set of 'php' and 'php53' packages installed. Installation or upgrade may fail or produce unexpected results. To resolve this issue run "sed -i.bak -e '/^\s*skip-bdb\s*$/d' /etc/my.cnf ; yum update 'php*' 'mysql*'".
    PHP Warning: Directive 'safe_mode' is deprecated in PHP 5.3 and greater in Unknown on line 0
    I have run the suggested command over SSH, but nothing changes. I have already searched all package names with "php", and I have this list:
    # yum list installed | grep php
    php-common.i386 5.3.13-5.el5.art installed
    php-pear.noarch 1:1.4.9-8.el5 installed
    php5-ioncube-loader.i386 4.0.7-11062118 installed
    php53.i386 5.3.3-13.el5_8 installed
    php53-cli.i386 5.3.3-13.el5_8 installed
    php53-devel.i386 5.3.3-13.el5_8 installed
    php53-gd.i386 5.3.3-13.el5_8 installed
    php53-imap.i386 5.3.3-13.el5_8 installed
    php53-mbstring.i386 5.3.3-13.el5_8 installed
    php53-mcrypt.i386 5.3.3-1.el5 installed
    php53-mysql.i386 5.3.3-13.el5_8 installed
    php53-pdo.i386 5.3.3-13.el5_8 installed
    php53-sqlite2.i386 5.3.2-11041315 installed
    php53-xml.i386 5.3.3-13.el5_8 installed
    psa-appvault-phpads.noarch 2.0.8-8203520080409011611 installed
    psa-appvault-phpbb.noarch 3.0.0-8200820080409011626 installed
    psa-appvault-phpbook.noarch 1.50-8203220080409011638 installed
    psa-appvault-phpbugtracker.noarch 1.19-8203820080416050605 installed
    psa-appvault-phpdig.noarch 1.85-8203120080409011645 installed
    psa-appvault-phpmoney.noarch 1.3-8204320080409011649 installed
    psa-appvault-phpmyfamily.noarch 1.4.1-8203420080409011655 installed
    psa-appvault-phpmyvisites.noarch 2.3-8202820080409011701 installed
    psa-appvault-phprojekt.noarch 5.2-8200820080409011713 installed
    psa-appvault-phpsurveyor.noarch 0.98-8204320080409011723 installed
    psa-appvault-phpwebsite.noarch 0.10.2-8203420080409011738 installed
    psa-appvault-phpwiki.noarch 1.3.11-8204320080409011808 installed
    psa-php53-configurator.i386 1.6.1-cos5.build1013111101.14 installed
    After checking the Plesk file panel_preupgrade_checker.php, I think the warning is due to these lines (from panel_preupgrade_checker.php):
    foreach ($packages as $package) {
        $name = $package['name'];
        $hasPhp5 |= ($name == 'php' || strpos($name, 'php-') === 0);
        $hasPhp53 |= (strpos($name, 'php53') === 0);
    }
    Now, I think the problem is just due to the names of these packages:
    php-common.i386 5.3.13-5.el5.art installed
    php-pear.noarch 1:1.4.9-8.el5 installed
    Can you help me to resolve this situation?

    Read the article

  • Issue using a "used" SSD as a Windows 8.1 Boot Drive

    - by EpiGrad
    So, I'm something of a Mac person, but decided to take a stab at this whole "build yourself a PC" thing - right now, the thing is assembled, posts just fine, and can get to the BIOS. The problem is the drive I want to use - I intended to use an 80 GB Corsair SSD I've had sitting around as the boot drive, and a new Samsung SSD for games and the like. So I boot using a Windows 8.1 install USB stick, and if the Samsung drive is plugged in, it happily offers to install Windows on it. The Corsair drive though, it's flipped out - I reformatted it as a blank NTFS drive (it was HFS for Mac purposes) and the BIOS can't see it, nor can the Windows installer. What's wrong, and how do I fix it? The tools at my disposal are: the current ASUS BIOS that came with my motherboard (a Z87I-Deluxe), and a Mac running the latest OS X which can also boot to Windows 7 if needed via either Parallels or Boot Camp.
    Update 1: Based on a friend's suggestion to switch SATA ports, Windows 8.1's installer can now see the drive as Drive 0, Partition 1, an 83.8 GB "Primary" partition. But when I click it and hit "Next", I get the following error: "We couldn't create a new partition or locate an existing one. For more information, see the Setup log files" - not that it gives any clue how to access those.
    Update 2: Following a trail of Google suggestions, I ended up going into the advanced tools and just reformatting the drive as follows:
    Start DISKPART.
    Type LIST DISK and identify your SSD disk number (from 0 to n disks).
    Type SELECT DISK <n> where <n> is your SSD disk number.
    Type CLEAN.
    Type CREATE PARTITION PRIMARY.
    Type ACTIVE.
    Type FORMAT FS=NTFS QUICK.
    Type ASSIGN.
    Type EXIT twice (once to get out of DiskPart, the other to exit the command line tool).
    (Per these instructions.) This goes well enough, and now I can select the disk for installation, but I get a new error: "Windows 8 cannot be installed to this disk. The selected disk has an MBR partition table. On EFI systems, Windows can only be installed to GPT disks." So, Googling that, I do the following:
    select disk 0
    clean
    convert gpt
    exit
    ...and we might have fixed it. Windows is at least trying to install now.

    Read the article

  • SQL Saturday 43 in Redmond

    - by AjarnMark
    I attended my first SQLSaturday a couple of days ago, SQLSaturday #43 in Redmond (at Microsoft).  I got there really early, primarily because I forgot how fast I can get there from my home when nobody else is on the road.  On a weekday in rush hour traffic, that would have taken two hours to get there.  I gave myself 90 minutes, and actually got there in about 45.  Crazy! I made the mistake of going to the main Microsoft campus, but that’s not where the event was being held.  Instead it was in a big Microsoft conference center on the other side of the highway.  Fortunately, I had the address with me and quickly realized my mistake.  When I got back on track, I noticed that there were bright yellow signs out on the street corner that looked like they said they were for SOL Saturday, which actually was appropriate since it was the sunniest day around here in a long time. Since I was there so early, the registration was just getting setup, so I found Greg Larsen who was coordinating things and offered to help.  He put me to work with a group of people organizing the pre-printed raffle tickets and stuffing swag bags. I had never been to a SQLSaturday before this one, so I wasn’t exactly sure what to expect even though I have read about a few on some blogs.  It makes sense that each one will be a little bit different since they are almost completely volunteer driven, and the whole concept is still in its early stages.  I have been to the PASS Summit for the last several years, and was hoping for a smaller version of that.  Now, it’s not really fair to compare one free day of training run entirely by volunteers with a multi-day, $1,000+ event put on under the direction of a professional event management company.  But there are some parallels. At this SQLSaturday, there was no opening general session, just coffee and pastries in the common area / expo hallway and straight into the first group of sessions.  I don’t know if that was because there was no single room large enough to hold everyone, or for other reasons.  This worked out okay, but the organization guy in me would have preferred to have even a 15 minute welcome message from the organizers with a little overview of the day.  Even something as simple as, “Thanks to persons X, Y, and Z for helping put this together…Sessions will start in 20 minutes and are all in rooms down this hallway…the bathrooms are on the other side of the conference center…lunch today is pizza and we would like to thank sponsor Q for providing it.”  It doesn’t need to be much, certainly not a full-blown Keynote like at the PASS Summit, but something to use as a rallying point to pull everyone together and get the day off to an official start would be nice.  Again, there may have been logistical reasons why that was not feasible here.  I’m just putting out my thoughts for other SQLSaturday coordinators to consider. The event overall was great.  I believe that there were over 300 in attendance, and everything seemed to run smoothly.  At least from an attendee’s point of view where there was plenty of muffins in the morning and pizza in the afternoon, with plenty of pop to drink.  And hey, if you’ve got the food and drink covered, a lot of other stuff could go wrong and people will be very forgiving.  But as I said, everything appeared to run pretty smoothly, at least until Buck Woody showed up in his Oracle shirt.  Other than that, the volunteers did a great job! I was a little surprised by how few people in my own backyard that I know.  
It makes sense if you really think about it, given how many companies must be using SQL Server around here.  I guess I just got spoiled coming into the PASS Summit with a few contacts that I already knew would be there.  Perhaps I have been spending too much time with too few people at the Summits and I need to step out and meet more folks.  Of course, it also is different since the Summit is the big national event and a number of the folks I know are spread out across the country, so the Summit is the only time we’re all in the same place at the same time.  I did make a few new contacts at SQLSaturday, and bumped into a couple of people that I knew (and a couple others that I only knew from Twitter, and didn’t even realize that they were here in the area). Other than the sheer entertainment value of Buck Woody’s session, the one that was probably the greatest value for me was a quick introduction to PowerShell.  I have not done anything with it yet, but I think it will be a good tool to use to implement my plans for automated database recovery testing.  I saw just enough at the session to take away some of the intimidation factor, and I am getting ready to jump in and see what I can put together in the next few weeks.  And that right there made the investment worthwhile.  So I encourage you, if you have the opportunity to go to a SQLSaturday event near you, go for it!

    Read the article

  • .NET development on a Retina MacBook Pro with Windows 8

    - by Jeff
    I remember sitting in Building 5 at Microsoft with some of my coworkers, when one of them came in with a shiny new 11” MacBook Air. It was nearly two years ago, and we found it pretty odd that the OEMs building Windows machines sucked at industrial design in a way that defied logic. While Dell and HP were in a race to the bottom building commodity crap, Apple was staying out of the low-end market completely, and focusing on better design. In the process, they managed to build machines people actually wanted, and maintain an insanely high margin in the process. I stopped buying the commodity crap and custom builds in 2006, when Apple went Intel. As a .NET guy, I was still in it for Microsoft’s stack of development tools, which I found awesome, but had back to back crappy laptops from HP and Dell. After that original 15” MacBook Pro, I also had a Mac Pro tower (that I sold after three years for $1,500!), a 27” iMac, and my favorite, a 17” MacBook Pro (the unibody style) with an SSD added from OWC. The 17” was a little much to carry around because it was heavy, but it sure was nice getting as much as eight hours of battery life, and the screen was amazing. When the rumors started about a 15” model with a “retina” screen inspired by the Air, I made up my mind I wanted one, and ordered it the day it came out. I sold my 17”, after three years, for $750 to a friend who is really enjoying it. I got the base model with the upgrade to 16 gigs of RAM. It feels solid for being so thin, and if you’ve used the third generation iPad or the newer iPhone, you’ll be just as thrilled with the screen resolution. I’m typically getting just over six hours of battery life while running a VM, but Parallels 8 allegedly makes some power improvements, so we’ll see what happens. (It was just released today.) The nice thing about VMs is that you can run more than one at a time. Primarily I run the Windows 8 VM with four cores (the laptop is quad-core, but has 8 logical cores due to hyperthreading or whatever Intel calls it) and 8 gigs of RAM. I also have a Windows Server 2008 R2 VM I spin up when I need to test stuff in a “real” server environment, and I give it two cores and 4 gigs of RAM. The Windows 8 VM spins up in about 8 seconds. Visual Studio 2012 takes a few more seconds, but count part of that as the “ReSharper tax” as it does its startup magic. The real beauty, the thing I most looked forward to, is that beautifully crisp C# text. Consolas has never looked as good at 10pt. as it does on this display. You know how it looks great at 80pt. when conference speakers demo stuff on a projector? Think that sharpness, only tiny. It’s just gorgeous. Beyond that, everything is just so responsive and fast. Builds of large projects happen in seconds, hundreds of unit tests run in seconds… you just don’t spend a lot of time waiting for stuff. It’s kind of painful to go back to my 27” iMac (which would be better if I put an SSD in it before its third birthday). Are there negatives? A few minor issues, yes. As is the case with OS X, not everything scales right. You’ll see some weirdness at times with splash screens and icons and such. Chrome’s text rendering (in Windows) is apparently not aware of how to deal with higher DPIs, so text is fuzzy (the OS X version is super sharp, however). You’ll also have to do some fiddling with keyboard settings to use the Windows 8 keyboard shortcuts. Overall, it’s as close to a no-compromise development experience as I’ve ever had. 
I’m not even going to bother with Boot Camp because the VM route already exceeds my expectations. You definitely get what you pay for. If this one also lasts three years and I can turn around and sell it, it’s worth it for something I use every day.

    Read the article

  • What Can We Learn About Software Security by Going to the Gym

    - by Nick Harrison
    There was a recent rash of car break-ins at the gym. Not an epidemic by any stretch, probably 4 or 5, but still... My gym used to allow you to hang your keys from a peg board at the front desk. This way you could come to the gym dressed to work out, lock your valuables in your car, and not have anything to worry about. Ignorance is bliss. The problem was that anyone who wanted to could go pick up your car keys, click the unlock button and find your car. Once there, they could rummage through your stuff and then walk back in and finish their workout as if nothing had happened. The people doing this were a little smarter than the average thief and would swipe some but not all of your cash, leaving everything else in place. Most thieves would steal the whole car and be busted more quickly. The victims were unaware that anything had happened for several days. Fortunately, once the victims realized what had happened, the gym was still able to pull security tapes and find out who was misbehaving. All of the bad guys were busted, and everyone can now breathe a sigh of relief. It is once again safe to go to the gym. Except there was still a fundamental problem. Putting your keys on a peg board by the front door is just asking for bad things to happen. One person got busted exploiting this security flaw. Others can still be exploiting it. In fact, others may well have been exploiting it and simply never got caught. How long would it take you to realize that $10 was missing from your wallet, if everything else was there? How would you even know when it went missing? Would you go to the front desk and even bother to ask them to review security tapes if you were only missing a small amount? Once highlighted, it is easy to see how commonly such a vulnerability may have been exploited. So the gym took the very reasonable precaution of removing the peg board. To me the most shocking part of this story is the resulting uproar from gym members losing the convenient key peg. How dare they remove the trusted peg board? How can I work out now, I have to carry my keys from machine to machine? How can I enjoy my workout with this added inconvenience? This all happened a couple of weeks ago, and some people are still complaining. In light of the recent high profile hacking, there are a couple of parallels that can be drawn. Many web sites are riddled with vulnerabilities as crazy and easily exploitable as leaving your car keys by the front door while you work out. No one ever considered thanking the people who were swiping these keys for pointing out the vulnerability. Without hesitation, they had their gym memberships revoked and are awaiting prosecution. The gym did recognize the vulnerability for what it is, and closed up that attack vector. What can we learn from this? Monitoring and logging will not prevent a crime but they will allow us to identify that a crime took place and may help track down who did it. Once we find a security weakness, we need to eliminate it. We may never identify and eliminate all security weaknesses, but we cannot allow well known vulnerabilities to persist in our system. In our case, we are not likely to meet resistance from end users. We are more likely to meet resistance from stakeholders, product owners, keepers of schedules and budgets. We may meet resistance from integration partners, co workers, and third party vendors. Regardless of the source, we will see resistance, but the weakness needs to be dealt with. 
There is no need to glorify a cracker for bringing to light a security weakness. Regardless of their claimed motives, they are not heroes. There is also no point in wasting time defending weaknesses once they are identified. Deal with the weakness and move on. It may be embarrassing to find security weaknesses in our systems, but it is even more embarrassing to continue ignoring them. Even if it is unpopular, we need to seek out security weaknesses and eliminate them when we find them. http://www.sans.org has put together the Common Weakness Enumeration http://cwe.mitre.org/ which lists out common weaknesses. The site navigation takes a little getting used to, but there is a treasure trove here. Here is the detail page for SQL Injection. It clearly states how this can be exploited, in case anyone doubts that the weakness should be taken seriously, and more importantly how to mitigate the risk.
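
    For the SQL injection weakness referenced above, the standard mitigation in .NET data access code is simply to stop concatenating user input into SQL text and pass it as a parameter instead. A minimal ADO.NET sketch (the connection string, table and column names are invented for illustration):

        using System.Data.SqlClient;

        public static class MemberLookup
        {
            // Vulnerable version (do not do this):
            //   "SELECT MemberId FROM Members WHERE Email = '" + email + "'"
            // Parameterized version: the value travels as typed data, never as SQL text.
            public static int FindMemberId(string connectionString, string email)
            {
                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand(
                    "SELECT MemberId FROM Members WHERE Email = @email", connection))
                {
                    command.Parameters.AddWithValue("@email", email);
                    connection.Open();
                    object result = command.ExecuteScalar();
                    return result == null ? -1 : (int)result;
                }
            }
        }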

    Read the article

  • How can we implement change notification propagation for WPF and SL in the MVVM pattern?

    - by Firoso
    Here's an example scenario targeting MVVM WPF/SL development: the view data-binds to the view model property "Target"; "Target" exposes a field of an object called "data" that exists in the local application model, called "Original"; when "Original" changes, it should raise a notification to the view model and then propagate that change notification to the view. Here are the solutions I've come up with, but I don't like any of them all that much. I'm looking for other ideas; by the time we come up with something rock solid I'm certain Microsoft will have released .NET 5 with WPF/SL extensions for better tools for MVVM development. For now the question is, "What have you done to solve this problem and how has it worked out for you?"
    Option 1. Proposal: Attach a handler to data's PropertyChanged event that watches for the string names of properties it cares about that might have changed, and raises the appropriate notification. Why I don't like it: Changes don't bubble naturally, objects must be explicitly watched, and if data changes to a new source, events must be un-registered/re-registered. Why I kind of like it: I get explicit control over propagation of changes, and I don't have to use any types that belong at a higher level of the application such as dependency properties.
    Option 2. Proposal: Attach a handler to data's PropertyChanged event that re-raises the event across all properties using the same property name. Why I don't like it: This is essentially the same as option 1, but less intelligent, and forces me to never change my property names, as they have to be the same as the property names on data. Why I kind of like it: It's very easy to set up and I don't have to think about it... Then again if I try to think, and change names to things that make sense, I shoot myself in the foot, and then I have to think about it!
    Option 3. Proposal: Inherit my view model from DependencyObject, and notify binding sources of changes directly. Why I don't like it: I'm not even 100% sure dependency properties/objects can DO this, it was just a thought to look into. Also I don't personally feel that WPF/SL types like DependencyObject belong at the view model level. Why I kind of like it: IF it has the capability that I'm seeking then it's a good answer! Minus that pesky layering issue.
    Option 4. Proposal: Use a consistent agent messaging system based off of the Task Parallel Dataflow Library to propagate everything through linked pipelining. Why I don't like it: I've never tried this, and somehow I think it will be lacking, plus it requires me to think about my code completely differently all the way around. Why I kind of like it: It has the possibility of allowing me to do some VERY fun manipulations with a push-based data model and using ActionBlocks as validation AND setters to then privately change view model properties and explicitly control PropertyChanged notifications.
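
    For what it is worth, here is a minimal sketch of Option 1 in plain C#: the view model subscribes to the wrapped model's PropertyChanged and re-raises its own notification for the properties it exposes. INotifyPropertyChanged is the standard WPF/Silverlight mechanism; the DataModel/TargetViewModel names and the Original/Target properties are stand-ins for the "data"/"Target" objects described in the question:

        using System.ComponentModel;

        // Stand-in for the question's "data" object living in the application model.
        public class DataModel : INotifyPropertyChanged
        {
            private string original;

            public string Original
            {
                get { return original; }
                set
                {
                    if (original == value) return;
                    original = value;
                    var handler = PropertyChanged;
                    if (handler != null) handler(this, new PropertyChangedEventArgs("Original"));
                }
            }

            public event PropertyChangedEventHandler PropertyChanged;
        }

        // Option 1: the view model watches the model and translates its notifications.
        public class TargetViewModel : INotifyPropertyChanged
        {
            private readonly DataModel data;

            public TargetViewModel(DataModel data)
            {
                this.data = data;
                this.data.PropertyChanged += OnDataPropertyChanged;
            }

            // The view binds to Target; it simply surfaces data.Original.
            public string Target
            {
                get { return data.Original; }
            }

            public event PropertyChangedEventHandler PropertyChanged;

            private void OnDataPropertyChanged(object sender, PropertyChangedEventArgs e)
            {
                // Re-raise only for the model properties this view model actually exposes.
                if (e.PropertyName == "Original")
                {
                    var handler = PropertyChanged;
                    if (handler != null) handler(this, new PropertyChangedEventArgs("Target"));
                }
            }
        }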

    Read the article

  • Using Apple’s Bonjour service from .NET?

    - by Martín Marconcini
    I have an iPhone app that publishes through Bonjour. The Mac counterpart works; they sync and exchange data. Now I have to port that little Mac app to Windows. I've decided to go with .NET (because that's what I know). The app is not complex, but I'm in the early stages. I need to browse/discover Bonjour services. For this task, I've downloaded Mono.Zeroconf and Apple's latest SDK (which includes a couple of C# samples). I'm not really pasting code because I'm copy/pasting the samples. In fact, Mono.Zeroconf has a MZClient.exe that can be used to test "all the API". My first test was, on the same box, to open two cmd.exe windows, launch MZClient in one to register a service, and launch it in the other to "discover" it. It doesn't work. Here's the server:
      C:\MZ>MZClient -v -p "_http._tcp 80 mysimpleweb"
      *** Registering name = 'mysimpleweb', type = '_http._tcp', domain = 'local.'
      *** Registered name = 'mysimpleweb'
    On the other terminal:
      c:\MZ>MZClient -v -t "_http._tcp"
      Creating a ServiceBrowser with the following settings:
      Interface = 0 (All)
      Address Protocol = Any
      Domain = local
      Registration Type = _http._tcp
      Resolve Shares = False
      Hit ^C when you're bored waiting for responses.
    And that's it. Nothing happens. I've of course tried with different services to no avail. I even played a little bit with that domain thing. Remember, this is the same box. I also tried on another computer, because this was a VM inside OSX, so I went ahead and tried on a "pure" Win XP. Nothing. Note: I have the Apple Bonjour Service (up and running) and also the Apple SDK (installed later). Given that this didn't work, I went ahead and decided to try the Apple SDK, which has an Interop assembly and a few pre-compiled samples (and their source code). Short story: neither the mDNSBrowser.exe nor the SimpleChat.exe samples work/see/discover anything. My box is Win7 under Parallels, but that doesn't seem to be affecting anything, given that the native XP exhibits the same problems. What am I doing so awfully wrong?
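    For reference, here is roughly what the same register/browse steps look like in code rather than through MZClient. It is only a sketch pieced together from the Mono.Zeroconf samples; the exact types and signatures may differ between versions of the library, so treat them as assumptions.

        // Sketch based on the Mono.Zeroconf sample code; API details may vary by version.
        using System;
        using Mono.Zeroconf;

        class BonjourTest
        {
            static void Main()
            {
                // Publish a service, roughly what "MZClient -p" does.
                RegisterService service = new RegisterService();
                service.Name = "mysimpleweb";
                service.RegType = "_http._tcp";
                service.ReplyDomain = "local.";
                service.Port = 80;
                service.Register();

                // Browse for services of the same type, roughly what "MZClient -t" does.
                ServiceBrowser browser = new ServiceBrowser();
                browser.ServiceAdded += delegate(object o, ServiceBrowseEventArgs args)
                {
                    Console.WriteLine("Found service: {0}", args.Service.Name);
                };
                browser.Browse("_http._tcp", "local");

                Console.WriteLine("Press Enter to exit.");
                Console.ReadLine();
            }
        }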

    Read the article

  • I Clobbered a Leopard with a Window Last Night

    - by D'Arcy Lussier
    I’ve had my 15” MacBook Pro for a little over a year now, and it’s hands-down the best laptop I’ve ever owned…hardware-wise. And I tried, I really really tried, to like OSX. I even bought Parallels so I could run Windows 7 and all my development tools while still trying to live in an OSX world. But in the end, I missed Windows too much. There were just too many shortcomings with OSX that kept me from being productive.
    For one thing, Office for Mac is *not* Office for Windows. The applications are written by different teams, and Excel on the Mac is just different enough to be painful. The VM experience was adequate, but my MBP would heat up like crazy when running it, and getting Windows apps to interact with an OSX file system was awkward. And I found I was in the VM more than I thought I’d be. iMovie is not as easy to use for simple movie editing as Windows Movie Maker. There’s no free blog editing software for OSX that’s on par with Windows Live Writer. And really, all I was using OSX for was Twitter (which I can use a Windows client for) and web browsing (also something Windows can provide, obviously). So I had to ask myself – why am I forcing myself to use an operating system I don’t like, on a laptop that can support Windows 7? And so I paved my MBP and am happily running Windows 7 on it…and it’s fantastic! All the good stuff with the hardware is still there, with the goodness of Win 7. Happy happy. I did run into some snags doing this though, and that’s really what this blog post is about – things to be aware of if you want to install Win 7 directly on your MBP metal.
    First, Ensure You Have Your Original Mac Install Disk
    This was a warning my buddy Dylan, who’s been running Win 7 on his MBP for a while now, gave me early on. The reason you need that original disk is that the hardware drivers you need are all located there. Apparently you can’t easily download them, so make sure you have them ahead of time.
    Second, Forget BootCamp
    The only reason you need BootCamp is if you still want the option to boot into OSX. If you don’t, then you don’t need BootCamp. In fact, you don’t even need BootCamp to install Win 7. What you *will* need, though, is a DVD with Win 7 burnt on it. Apple doesn’t support bootable USB drives. Well, actually they do for MacBook Airs, which don’t come with optical drives…but to get that working you’ll need to edit a BootCamp system file so your model of MBP is included in an XML document, and even then you are *still* using BootCamp, meaning you’ll be making an OSX partition. So don’t worry about BootCamp; just burn a Windows 7 disc, put it into the DVD drive, and restart your MBP.
    Third, Know The Secret Commands
    After putting in the Windows 7 DVD and restarting your MBP, you’ll want to hold down the ‘C’ key during boot-up. This tells the MBP that it should boot from the DVD drive instead of the hard drive. Interestingly, it appears you don’t have to do this if it’s the Mac OSX install disc (more on that in a second), but regardless – hold down C and Windows will start the install process. Next up is the partition process. You’ll notice that there’s a partition called EFI or something like that. This has to do with the drive format that Apple uses and how they partition their system drives. What I did – I blew it away! At first I didn’t, but I was told I couldn’t install Windows on the remaining space due to the different drive format. Blowing away the EFI partition (and all other partitions) allowed me to continue the Windows install.
    *REMEMBER – no warranty is provided or implied; I’m just telling you what I did and how I got it to work.
    Ok, so now Windows is installed and I’m rebooting. Everything looks good, but I need drivers! So I put in the OSX install DVD and run the BootCamp assistant, which installs all the Windows drivers I need. Fantastic! Oh, I need to restart – no problem. OH NO, PROBLEM! I left the OSX install DVD in the drive and now the MBP wants to boot from the drive and install OSX! I’m not holding down the C key, what the heck?! Ok, well there must be a way to eject this disk…hmm…no physical button on the side…the eject button doesn’t seem to work on the keyboard…no little pin hole to insert something to force the disc out…well what the…?! It turns out, if you want to eject a disc at boot-up, you need (and I kid you not) to plug a mouse into the laptop and hold down the right-click button while it’s booting. This ejected the disc for me. Seriously.
    Finally, Things You Should Be Aware Of
    Once you have Windows up and running there are a few things you need to be aware of, mainly new keyboard shortcuts. For instance, on the Mac keyboard there is no Home, End, PageUp or PageDown. There’s also no obvious way to do something like select large amounts of text (like you would by holding Shift-Home at the end of a line of text, for instance). So here are some shortcuts you need to know:
    Home – fn + left arrow
    End – fn + right arrow
    Select to the start of a line as you would with the Home key – Shift + fn + left arrow
    Select to the end of a line as you would with the End key – Shift + fn + right arrow
    Page Up – fn + up arrow
    Page Down – fn + down arrow
    Also, you’ll notice that the awesome Mac track pad doesn’t respond to taps as clicks. No fear, this is just a setting that needs to be altered in the BootCamp control panel (which controls the Mac hardware-specific settings within Windows; you can access it easily from the system tray icon). One other thing: battery life seems a bit lower than with OSX, but then again I’m also doing more than Twitter or web browsing on this thing now.
    Conclusion
    My laptop runs awesome now that I have Windows 7 on there. It’s obviously up to individual taste, but for me I just didn’t see the benefits of living in an OSX world when everything I needed lived in Windows. And also, I’m finally back on an operating system that doesn’t require me to eject a USB drive before physically removing it! It’s 2012 folks, how has this not been fixed?! D

    Read the article

  • Guest Post: Using IronRuby and .NET to produce the ‘Hello World of WPF’

    - by Eric Nelson
    [You might want to also read other GuestPosts on my blog – or contribute one?]
    On the 26th and 27th of March (2010) myself and Edd Morgan of Microsoft will be popping along to the Scottish Ruby Conference. I dabble with Ruby and am a huge fan, whilst Edd is a “proper Ruby developer”. Hence I asked Edd if he was interested in creating a guest post or two for my blog on IronRuby. This is the second of those posts. If you should stumble across this post and happen to be attending the Scottish Ruby Conference, then please do keep a look out for myself and Edd. We would both love to chat about all things Ruby and IronRuby. And… we should have (if Amazon is kind) a few books on IronRuby with us at the conference which will need to find a good home. This is me and Edd and… the book. Order on Amazon: http://bit.ly/ironrubyunleashed
    Using IronRuby and .NET to produce the ‘Hello World of WPF’
    In my previous post I introduced, to a minor extent, IronRuby. I expanded a little on the basics by getting a Rails app up and running on this .NET implementation of the Ruby language — but there wasn't much to it! So now I would like to go from simply running a pre-existing project under IronRuby to developing a whole new application demonstrating the seamless interoperability between IronRuby and .NET. In particular, we'll be using WPF (Windows Presentation Foundation) — the component of the .NET Framework stack used to create rich media and graphical interfaces.
    Foundations of WPF
    To reiterate, WPF is the engine in the .NET Framework responsible for rendering rich user interfaces and other media. It's not the only collection of libraries in the framework with the power to do this — Windows Forms does the trick, too — but it is the most powerful and flexible. Put simply, WPF really excels when you need to employ eye candy. It's all about creating impact. Whether you're presenting a document, video, a data entry form, some kind of data visualisation (which I am most hopeful for, especially in terms of IronRuby - more on that later) or chaining all of the above with some flashy animations, you're likely to find that WPF gives you the most power when developing any of these for a Windows target. Let's demonstrate this with an example. I give you what I like to consider the 'hello, world' of WPF applications: the analogue clock.
    "Today, over my lunch break, I created a WPF-based analogue clock using IronRuby... Any normal person would have just looked at their watch." - Twitter
    The Sample Application
    Click here to see this sample in full on GitHub. The sample uses Windows Presentation Foundation from IronRuby to create a Clock class, then invokes the Clock class, which gives you the rendered clock. It is by no means perfect (it was a lunch break), but I think it does the job of illustrating IronRuby's interoperability with WPF using a familiar data visualisation. I'm sure you'll want to dissect the code yourself, but allow me to step through the important bits. (By the way, feel free to run this through ir first to see what actually happens.) Now we're using IronRuby - unlike my previous post, where we took pure Ruby code and ran it through ir, the IronRuby interpreter, to demonstrate compatibility. The main thing of note is the very distinct parallels between .NET namespaces and Ruby modules, and between .NET classes and Ruby classes. I guess there's not much to say about it other than that, at this point, you may as well be working with a purely Ruby graphics-drawing library.
    You're instantiating .NET objects, but you're doing it with the standard Ruby .new method you know from Ruby as Object#new — although the root object of all your IronRuby objects isn't actually Object, it's System.Object. You're calling methods on these objects (and classes, for example in the call to System.Windows.Controls.Canvas.SetZIndex()) using the underscored, lowercase convention established for the Ruby language. The integration is so seamless. The fact that you're using a dynamic language on top of .NET's CLR is completely abstracted from you, allowing you to just build your software.
    A Brief Note on Events
    Events are a big part of developing client applications in .NET, as well as under every other environment I can think of. In case you aren't aware, event-driven programming is essentially the practice of telling your code to call a particular method, or other chunk of code (a delegate), when something happens at an unpredictable time. You can never predict when a user is going to click a button, move their mouse or perform any other kind of input, so the advent of the GUI is what necessitated event-driven programming. This is where one of my favourite aspects of the Ruby language, blocks, can really help us. In traditional C#, for instance, you may subscribe to an event (assign a block of code to execute when an event occurs) in one of two ways: by passing a reference to a named method, or by providing an anonymous code block. You'd be right for seeing the parallel here with Ruby's concept of blocks, Procs and lambdas. As demonstrated at the very end of this rather basic script, we are using .NET's System.Timers.Timer to (attempt to) update the clock every second (I know it's probably not the best way of doing this, but it's for example's sake). Note: diverting a little from what I said above, the ticking of a clock is very predictable, yet we still use the event our Timer throws to do this updating as one of many ways to perform that task outside of the main thread. You'll see that all that's needed to assign a block of code to be triggered on an event is to provide that block to the method with the name of the event as it is known to the CLR. The drawback to this is that it only allows the delegation of one code block to each event. You may use the add method to subscribe multiple handlers to that event, pushing each one to the end of a queue. Like so:
      def tick
        puts "tick tock"
      end

      timer.elapsed.add method(:tick)
      timer.elapsed.add proc { puts "tick tock" }

      tick_handler = lambda { puts "tick tock" }
      timer.elapsed.add(tick_handler)
    The ability to just provide a block of code as an event handler helps IronRuby towards that very important term I keep throwing around: low ceremony. Anonymous methods are, of course, available in other, more conventional .NET languages such as C# and VB but, as usual, they feel ever so much more elegant and natural in IronRuby. Note: whether it's a named method or an anonymous chunk o' code, the block you delegate to the handling of an event can take arguments - commonly, a sender object and some args.
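    For comparison's sake, and purely as an illustrative sketch rather than code from the post, the C# equivalent of subscribing to a Timer's Elapsed event in the two ways described above might look like this:

        // Sketch of the two C# subscription styles the post contrasts with IronRuby:
        // a named method and an anonymous block (here, a lambda).
        using System;
        using System.Timers;

        class TickDemo
        {
            static void Tick(object sender, ElapsedEventArgs e)
            {
                Console.WriteLine("tick tock");
            }

            static void Main()
            {
                var timer = new Timer(1000); // raise Elapsed every second

                timer.Elapsed += Tick;                                           // named method
                timer.Elapsed += (sender, e) => Console.WriteLine("tick tock");  // anonymous block

                timer.Start();
                Console.ReadLine(); // keep the process alive so the timer can fire
            }
        }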
    Another Brief Note on Verbosity
    Personally, I don't mind verbose chaining of references in my code as long as it doesn't interfere with performance - as evidenced in the example above. While I love clean code, there's a certain feeling of safety that comes with the terse explicitness of long-winded addressing and the describing of objects as opposed to ambiguity (not unlike this sentence). However, when working with IronRuby, even I grow tired of typing System::Whatever::Something. Some people enjoy simply assuming namespaces and forgetting about them, regardless of the language they're using. Don't worry, IronRuby has you covered. It is completely possible, with a call to include, to bring the contents of a .NET-converted module into the context of your IronRuby code - just as you would if you wanted to bring in an 'organic' Ruby module. To refactor the style of the above example, I could place the following at the top of my Clock class:
      class Clock
        include System::Windows::Shapes
        include System::Windows::Media
        include System::Windows::Threading
        # and so on...
    And by doing so, reduce calls to System::Windows::Shapes::Ellipse.new to simply Ellipse.new, or references to System::Windows::Threading::DispatcherPriority.Render to a friendlier DispatcherPriority.Render.
    Conclusion
    I hope by now you can better understand how IronRuby interoperates with .NET and how you can harness the power of the .NET Framework with the dynamic nature and elegant idioms of the Ruby language. The manner and parlance of Ruby that make it a joy to work with sets of data are, of course, present in IronRuby — couple that with WPF's capability to produce great graphics quickly and easily, and I hope you can visualise the possibilities for data visualisation using these two things. Using IronRuby and WPF together to create visual representations of data and infographics is very exciting to me. Although today, with this project, we're only presenting one simple piece of information - the time - the potential is much grander. My day-to-day job is centred around software development and UI design, specifically in the realm of healthcare, and if you were to pay a visit to our office you would behold, directly above my desk, a large plasma TV with a constantly rotating, animated slideshow of charts and infographics that help members of our team do their jobs. It's an app powered by WPF which never fails to spark some conversation with visitors whose gaze it has hooked. If only it were written in IronRuby, the pleasantly low ceremony and reduced pre-processing time for my brain would have helped greatly.
    Edd Morgan blog
    Related Links: Getting PHP and Ruby working on Windows Azure and SQL Azure

    Read the article

< Previous Page | 5 6 7 8 9 10  | Next Page >