Search Results

Search found 17619 results on 705 pages for 'build guides'.

Page 300/705 | < Previous Page | 296 297 298 299 300 301 302 303 304 305 306 307  | Next Page >

  • How would you explain that software engineering is more specialized than other engineering fields?

    - by Spencer K
    I work with someone who insists that any good software engineer can develop in any software technology, and that experience in a particular technology doesn't matter to building good software. His analogy was that you don't have to have knowledge of the product being built to know how to build an assembly line that manufactures said product. In a way it's a compliment to be viewed with an eye such that "if you're good, you're good at everything", but in a way it also trivializes the profession, as in "Codemonkey, go sling code". Without experience in certain software frameworks you can get in trouble fast, and that's important. I tried explaining this, but he didn't buy it. Any different views or thoughts on this to help explain that my experience in one thing doesn't translate to all things?

    Read the article

  • Open source level editor for HTML5 platform game?

    - by Lai Yu-Hsuan
    A nice GUI editor is very helpful for creating level maps, and I want to use an open-source option rather than build my own from scratch. I found Tiled Map Editor, but it doesn't work for what I want. Though I'm building an HTML5 game, I don't have to use an HTML5 level editor as long as it can output well-formatted map files which my JavaScript can read. Edit: Sorry for the confusion. Tiled does not work for me because, to make the player perform a 'tricky' jump, sometimes I want to set the distance between two platforms to, say, 7/3 or 8/3 tiles. But in Tiled I get only 2 or 3. If Tiled can do this, please teach me.

    Read the article

  • Ubuntu 13.04 installation issues: unable to handle kernel paging request error

    - by user173944
    I wish I could say that I've done more for the Linux community as of recent, but I am very VERY new to all of this and I feel very much in over my head. I figured I would install Ubuntu on my computer and then learn and contribute to the community simultaneously. I will try to be as detailed as I can; please ask questions if you need clarification. I installed Ubuntu 13.04 (64-bit) on my Dell Inspiron 1501. This has an AMD Turion 64-bit TL-56 1.8 GHz mobile processor. It is a dual core. It has an ATI Radeon Xpress 1150 chipset in it as well. As of right now I only have a total of 2GB of RAM; however, I was planning on upgrading that in the near future, so I opted for the 64-bit Ubuntu 13.04. I first tried the live CD and everything seemed to be functioning correctly other than the wireless (but that's not the issue at hand; there are plenty of guides on the internet on how to get that functioning). The internet worked just fine when it was plugged in, so no issues there. However, once I went from that to installing 13.04 (just 13.04, no dual partitioning... I want this computer to run strictly Ubuntu) it did not work. It took me into a shell that I could not type anything into. In this shell it said "BUG: unable to handle kernel paging request" and then it called a bunch of traces and froze up. I had to hard reset the laptop. I tried the boot-repair program multiple times with many different settings, and typically after starting up, the laptop would say something along the lines of "recursive errors, will attempt to fix", and then it would attempt to fix a couple of things, and then the computer would freeze up after the text said "end trace"... so I had to hard reset it again. I'm not an impatient person either; when I say it would freeze up, it would be for a period of at least 15 minutes each time before I decided to hard reset. I attempted to install 12.10 on it instead and I got the same exact message, and when I ran boot-repair it did the same exact thing as before. I am currently in the process of running memtest86+ on the computer's memory, though I really don't believe that it, nor any of the hardware, is at fault, due to the fact that the machine was still running Windows Vista perfectly when I decided to switch over to Ubuntu. So far the memtest has come back just fine without any errors, but I've only been running it for approximately an hour. So this is the situation I'm in. I did notice when I was using the live disk that the video driver needed updating, so I performed that, though I'm fairly certain that has nothing to do with this. I have also attempted (though I'm not certain that my attempt was successful in accomplishing what I had planned) to manually edit the GRUB settings by setting acpi=0 on top of adding nomodeset to the boot commands. Like I said, I'm not sure I did that correctly, but I'm fairly certain I did. If anyone needs any more information I will be more than happy to provide it; I will post back once I get the full results of the memtest. I very much appreciate any ideas anyone else has; I've been at this for a few days to no avail... thank you

    Read the article

  • How to integrate game logic in game engines

    - by MahanGM
    Recently I've been working on a 2D game engine example in .NET with C#. My main problem is that I can't figure out how I should include the game logic within the game. Currently I have a base engine which is a set of classes that run sub-systems like Render, Sound, Input and Core functionality. There is an editor which helps the user to add resources, build levels, write scripts and other stuff. I came up with an idea to use Reflection and CSharpCodeProvider from my editor to compile the written code. This way I can get an executable of my product too. This approach works quite well, but I would like to know what's really the right solution and architecture for this. My engine targets 2D platform games. The scripting language is C# right now because I can't consider any other embeddable language for now. The game needs compilation, and CSharpCodeProvider is the only way for me to do it in the meantime.
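    For what it's worth, here is a minimal sketch of the compile-and-reflect approach described above: CSharpCodeProvider compiles a script string into an in-memory assembly, and reflection invokes a method on the result. The EnemyScript class and its Update method are invented for the example.

        using System;
        using System.CodeDom.Compiler;
        using System.Reflection;
        using Microsoft.CSharp;

        class ScriptHost
        {
            static void Main()
            {
                // A "script" as it might come out of the level editor.
                string source = @"
                    public class EnemyScript
                    {
                        public void Update() { System.Console.WriteLine(""enemy updated""); }
                    }";

                using (var provider = new CSharpCodeProvider())
                {
                    var options = new CompilerParameters { GenerateInMemory = true };
                    CompilerResults results = provider.CompileAssemblyFromSource(options, source);
                    if (results.Errors.HasErrors)
                    {
                        // Surface script errors back in the editor UI.
                        foreach (CompilerError error in results.Errors)
                            Console.WriteLine(error.ErrorText);
                        return;
                    }

                    // Find the script type by name and call into it via reflection.
                    object script = results.CompiledAssembly.CreateInstance("EnemyScript");
                    script.GetType().GetMethod("Update").Invoke(script, null);
                }
            }
        }

    The same pattern scales to an engine loop: compile once when the level loads, cache the MethodInfo, and invoke it every frame.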

    Read the article

  • Roll Your Own DIY Solar-Powered Security Camera Setup

    - by Jason Fitzpatrick
    If you're looking to set up a security camera without running power or video lines, this solar-powered version combines a cheap Wi-Fi cam with a home-rolled solar setup to provide surveillance without wires. Courtesy of Reddit user CheapGuitar, the setup combines a dirt cheap off-brand Wi-Fi security camera, a Tupperware container spray painted black, some old camping solar panels, and a battery into a security camera that checks in as long as it's in range of a Wi-Fi router or repeater. Hit up the link below to check out the build guide. Solar Powered Camera [via Hack A Day]

    Read the article

  • How to integrate the .gdf with a specific exe for Games Explorer

    - by Kraemer
    Hello, I want to create an installer for a game, and after that an icon to be put in Games Explorer for Windows Vista and Windows 7. I have created the GDF (game definitions file), then built the script for the project and obtained the .h, GDF and .rc files. But I can't compile the .rc file into an executable using Visual Studio 2010, which I need in order to create the installer. An error pops up after I set the executable path: "Could not load file or assembly 'Microsoft.VisualStudio.HpcDebugger.Impl, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified." Any ideas what I'm doing wrong? I need to mention that I've never worked before with GDF Editor and Visual Studio. Any answer would be highly appreciated. Thanks!

    Read the article

  • How to access shared folders from file editors?

    - by Marchosius
    I am running a XAMPP server on one of my other computers, and when using the browse functionality from text editors like gedit, Bluefish and Eclipse/Aptana Studio I can access the shares I made for the sites running on the server. But even when I right-click → Edit with → application in the file browser, it does not open the file in the editor. How can I get this to work without having to copy the file over from the share, then edit, and then upload again after I am done? I have a similar issue when trying to upload a file to a hosting server directly from my file server with FileZilla: here I also need to copy the file over to my PC and then upload from there to the hosting server with FileZilla. EDIT: For Eclipse I found built-in functionality to do this (New → Remote System Connection → enter the access details). Nothing yet on FileZilla. Hope my question is clear.

    Read the article

  • How to use call web service action in SharePoint2013 workflow

    - by ybbest
    In SharePoint 2013, you can use the call web service action and loops. In this post, I will show you how to achieve this.
    1. Create a list workflow called CallWebService.
    2. Create a variable called listurl and assign it the value http://sp2010/_vti_bin/listdata.svc.
    3. Create a dictionary variable called RequestHeaders and add the following key-value pairs.
    4. Call the web service with the HttpHeaders you just built in the previous step and store the response in the variable ResponseContent.
    5. The ResponseContent variable is of the dynamic values type (in SharePoint Designer it is called the dictionary type), which is a new feature of the SharePoint 2013 workflow. We can use the following actions to count the number of items in the variable.
    6. You can use a loop in the SharePoint 2013 workflow and output each list title as shown below.
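    For readers who find it easier to see the request as code, here is a rough C# equivalent of what steps 2-4 do: a GET against the listdata.svc URL with a JSON Accept header. This is only an illustration of the HTTP call the workflow action makes; the exact key-value pairs from step 3 are not reproduced in this post, so the Accept value below (application/json;odata=verbose) is an assumption of a typical choice.

        using System;
        using System.Net.Http;
        using System.Threading.Tasks;

        class ListDataCall
        {
            static async Task Main()
            {
                using (var client = new HttpClient())
                {
                    // Assumed header, playing the role of the RequestHeaders
                    // dictionary variable so the service returns JSON.
                    client.DefaultRequestHeaders.Add("Accept", "application/json;odata=verbose");

                    // Equivalent of the call web service action hitting the listurl variable.
                    string response = await client.GetStringAsync("http://sp2010/_vti_bin/listdata.svc");

                    // The workflow stores this body in the ResponseContent variable.
                    Console.WriteLine(response);
                }
            }
        }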

    Read the article

  • How do you balance documentation requirements with Agile developments

    - by Jeremy
    In our development group there are currently discussions around agile and waterfall methodology. No one has any practical experience with agile, but we are doing some reading. The agile manifesto lists 4 values:
    • Individuals and interactions over processes and tools
    • Working software over comprehensive documentation
    • Customer collaboration over contract negotiation
    • Responding to change over following a plan
    We are an internal development group developing applications for the consumption of other units in our enterprise. A team of 10 developers builds and releases multiple projects simultaneously, typically with 1 - maybe 2 (rarely) - developers on each project. It seems that, from a supportability perspective, the organization needs to put some real value on documentation - as without it, there are serious risks with resourcing changes. With agile favouring interactions and software deliverables over processes and documentation, how do you balance that with the requirements of supportable systems and maintaining knowledge and understanding of how those systems work? With a waterfall approach, which favours documentation (requirements before design, design specs before construction), it is easy to build a process that meets some of the organizational requirements - how do we do this with an agile approach?

    Read the article

  • Visual Studio Extensions

    - by Scott Dorman
    Originally posted on: http://geekswithblogs.net/sdorman/archive/2013/10/18/visual-studio-extensions.aspx
    As a product, Visual Studio has been around for a long time. In fact, it's been 18 years since the first Visual Studio product was launched. In that time, there have been some major changes, but perhaps the most important (or at least influential) changes for the course of the product have been in the last few years. While we can argue over what was and wasn't an important change or what has and hasn't changed, I want to talk about what I think is the single most important change Microsoft has made to Visual Studio. Specifically, I'm referring to the Visual Studio Gallery (first introduced in Visual Studio 2010) and the ability for third parties to easily write extensions which can add new functionality to Visual Studio or even change existing functionality. I know Visual Studio had this ability before the Gallery existed, but it was expensive (both from a financial and a development-resource perspective) for a company or individual to write such an extension. The Visual Studio Gallery changed all of that. As of today, there are over 4000 items in the Gallery. Microsoft itself has over 100 items in the Gallery, and more are added all of the time. Why is this such an important feature? Simply put, it allows third parties (companies such as JetBrains, Telerik, Red Gate, Devart, and DevExpress, just to name a few) to provide enhanced developer productivity experiences directly within the product by providing new functionality or changing existing functionality. However, there is an even more important function that it serves. It also allows Microsoft to do the same. By providing extensions which add new functionality or change existing functionality, Microsoft is not only able to rapidly innovate on new features and changes but also to get those changes into the hands of developers world-wide for feedback. The end result is that these extensions become very robust and often end up becoming part of a later product release. An excellent example of this is the new CodeLens feature of Visual Studio 2013. This is, perhaps, the single most important developer productivity enhancement released in the last decade, and it already has huge potential. Out of the box, CodeLens supports showing you information about references, unit tests and TFS history. Fortunately, CodeLens is also accessible to Visual Studio extensions, and Microsoft DevLabs has already written such an extension to show code "health." This extension shows different code metrics to help make sure your code is maintainable. At this point, you may have already asked yourself, "With over 4000 extensions, how do I find ones that are good?" That's a really good question. Fortunately, the Visual Studio Gallery has a ratings system in place, which definitely helps, but that's still a lot of extensions to look through. To that end, here is my personal list of favorite extensions. This is something I started back when Visual Studio 2010 was first released, but so much has changed since then that I thought it would be good to provide an updated list for Visual Studio 2013. These are extensions that I have installed and use on a regular basis as a developer, and that I find indispensable. This list is in no particular order.
    • NuGet Package Manager for Visual Studio 2013
    • Microsoft CodeLens Code Health Indicator
    • Visual Studio Spell Checker
    • Indent Guides
    • Web Essentials 2013
    • VSCommands for Visual Studio 2013
    • Productivity Power Tools (right now this is only for Visual Studio 2012, but it should be updated to support Visual Studio 2013)
    Everyone has their own set of favorites, so mine is probably not going to match yours. If there is an extension that you really like, feel free to leave me a comment!

    Read the article

  • HG: fork web app project to separate API code from app code

    - by cs_brandt
    I have a web app that's been in active development for about 8 months now, and it's becoming apparent that the project needs to maintain a separation between app-specific code and our OO JavaScript API. What I would like to do is have another repository with the following general structure for the JS API code:
    repo_name
    |
    +---build
    |
    +---build_tools
    |
    +---doc
    |
    +---src
    |
    +---js
    Of course this structure is different from the original web app directory structure. If I make changes to this new repository, how could I pull those changes into the web app repository without unintentionally removing files or modifying the directory structure of the web app repository?

    Read the article

  • How to create an Access database by using ADOX and Visual C# .NET

    - by SAMIR BHOGAYTA
    Build an Access Database
    1. Open a new Visual C# .NET console application.
    2. In Solution Explorer, right-click the References node and select Add Reference.
    3. On the COM tab, select Microsoft ADO Ext. 2.7 for DDL and Security, click Select to add it to the Selected Components, and then click OK.
    4. Delete all of the code from the code window for Class1.cs.
    5. Paste the following code into the code window:

        using System;
        using ADOX;

        class Class1
        {
            static void Main()
            {
                // Create the new Access database via the ADOX catalog COM object.
                ADOX.CatalogClass cat = new ADOX.CatalogClass();
                cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;" +
                           "Data Source=D:\\NewMDB.mdb;" +
                           "Jet OLEDB:Engine Type=5");
                Console.WriteLine("Database Created Successfully");
                cat = null; // release the reference to the COM object
            }
        }

    Read the article

  • The Batcave in LEGO

    - by Jason Fitzpatrick
    There seems to be something of an arms race afoot among hardcore LEGO enthusiasts, but given the awesome fruits of their labor we're not about to attempt an intervention. This amazing diorama, complete with functioning lighting, is a 20,000 piece tribute to the Batcave. Courtesy of builders Wayne Hussey and Carlyle Livingston, we're treated to a Batcave rendition in LEGO that's so detailed the close-up shots feel like you can step right into them. Hit up the link below to check out more detailed photos and videos of the build. LEGO Batcave [via Make]

    Read the article

  • Product Support Webcast for Existing Customers: Oracle Webcenter Portal 11g User & Administration Tips

    - by John Klinke
    Register for our upcoming Advisor Webcast 'Oracle WebCenter Portal 11g: User & Administration Tips' scheduled for November 12, 2013 at 11:00 am Eastern Standard Time (8:00 am Pacific Standard Time, 4:00 pm GMT, 5:00 pm Central European Time). This 1-hour session is recommended for technical and functional users who use Oracle WebCenter Portal to build company portals using run-time tools. Topics will include:
    • What's new in 11.1.1.8 of WebCenter Portal
    • Terminology changes
    • Using the Portal once it's built
    • Setting up self-registration (admins)
    • End-user experience
    • Development environment
    • Patching information
    For more information and to register for this Advisor Webcast, please see Oracle WebCenter Portal 11g: User & Administration Tips (Doc ID 1585902.1).

    Read the article

  • Oracle VM RAC template - what it took

    - by wcoekaer
    In my previous posting I introduced the latest Oracle Real Application Cluster / Oracle VM template. I mentioned how easy it is to deploy a complete Oracle RAC cluster with Oracle VM. In fact, you don't need any prior knowledge at all to get a complete production-ready setup going. Here is an example... I built a 4 node RAC cluster, completely configured in just over 40 minutes - starting from importing the template into Oracle VM and creating the VMs, to a fully up and running Oracle RAC. And what was needed? One text file with some hostnames and IP addresses, and deploycluster.py. The setup is a 4 node cluster where each VM has 8GB of RAM and 4 vCPUs. The shared ASM storage in this case is 100GB, 5 x 20GB volumes. The VM names are racovm.0-racovm.3. The deploycluster script starts the VMs, verifies the configuration and sends the database cluster configuration info through Oracle VM Manager to the 4 node VMs. Once the VMs are up and running, the first VM starts the actual Oracle RAC setup inside and talks to the 3 other VMs. I did not log into any VM until after everything was completed. In fact, I connected to the database remotely before logging in at all.

    # ./deploycluster.py -u admin -H localhost --vms racovm.0,racovm.1,racovm.2,racovm.3 --netconfig ./netconfig.ini
    Oracle RAC OneCommand (v1.1.0) for Oracle VM - deploy cluster - (c) 2011-2012 Oracle Corporation
    (com: 26700:v1.1.0, lib: 126247:v1.1.0, var: 1100:v1.1.0) - v2.4.3 - wopr8.wimmekes.net (x86_64)
    Invoked as root at Sat Jun 2 17:31:29 2012 (size: 37500, mtime: Wed May 16 00:13:19 2012)
    Using: ./deploycluster.py -u admin -H localhost --vms racovm.0,racovm.1,racovm.2,racovm.3 --netconfig ./netconfig.ini
    INFO: Login password to Oracle VM Manager not supplied on command line or environment (DEPLOYCLUSTER_MGR_PASSWORD), prompting...
    Password:
    INFO: Attempting to connect to Oracle VM Manager...
    INFO: Oracle VM Client (3.1.1.305) protocol (1.8) CONNECTED (tcp) to Oracle VM Manager (3.1.1.336) protocol (1.8) IP (192.168.1.40) UUID (0004fb0000010000cbce8a3181569a3e)
    INFO: Inspecting /root/rac/deploycluster/netconfig.ini for number of nodes defined...
    INFO: Detected 4 nodes in: /root/rac/deploycluster/netconfig.ini
    INFO: Located a total of (4) VMs; 4 VMs with a simple name of: ['racovm.0', 'racovm.1', 'racovm.2', 'racovm.3']
    INFO: Verifying all (4) VMs are in Running state
    INFO: VM with a simple name of "racovm.0" is in a Stopped state, attempting to start it...OK.
    INFO: VM with a simple name of "racovm.1" is in a Stopped state, attempting to start it...OK.
    INFO: VM with a simple name of "racovm.2" is in a Stopped state, attempting to start it...OK.
    INFO: VM with a simple name of "racovm.3" is in a Stopped state, attempting to start it...OK.
    INFO: Detected that all (4) VMs specified on command have (5) common shared disks between them (ASM_MIN_DISKS=5)
    INFO: The (4) VMs passed basic sanity checks and in Running state, sending cluster details as follows:
    netconfig.ini (Network setup): /root/rac/deploycluster/netconfig.ini
    buildcluster: yes
    INFO: Starting to send cluster details to all (4) VM(s).......
    INFO: Sending to VM with a simple name of "racovm.0"....
    INFO: Sending to VM with a simple name of "racovm.1".....
    INFO: Sending to VM with a simple name of "racovm.2".....
    INFO: Sending to VM with a simple name of "racovm.3"......
    INFO: Cluster details sent to (4) VMs... Check log (default location /u01/racovm/buildcluster.log) on build VM (racovm.0)...
    INFO: deploycluster.py completed successfully at 17:32:02 in 33.2 seconds (00m:33s)
    Logfile at: /root/rac/deploycluster/deploycluster2.log

    My netconfig.ini:

    # Node specific information
    NODE1=db11rac1
    NODE1VIP=db11rac1-vip
    NODE1PRIV=db11rac1-priv
    NODE1IP=192.168.1.56
    NODE1VIPIP=192.168.1.65
    NODE1PRIVIP=192.168.2.2
    NODE2=db11rac2
    NODE2VIP=db11rac2-vip
    NODE2PRIV=db11rac2-priv
    NODE2IP=192.168.1.58
    NODE2VIPIP=192.168.1.66
    NODE2PRIVIP=192.168.2.3
    NODE3=db11rac3
    NODE3VIP=db11rac3-vip
    NODE3PRIV=db11rac3-priv
    NODE3IP=192.168.1.173
    NODE3VIPIP=192.168.1.174
    NODE3PRIVIP=192.168.2.4
    NODE4=db11rac4
    NODE4VIP=db11rac4-vip
    NODE4PRIV=db11rac4-priv
    NODE4IP=192.168.1.175
    NODE4VIPIP=192.168.1.176
    NODE4PRIVIP=192.168.2.5

    # Common data
    PUBADAP=eth0
    PUBMASK=255.255.255.0
    PUBGW=192.168.1.1
    PRIVADAP=eth1
    PRIVMASK=255.255.255.0
    RACCLUSTERNAME=raccluster
    DOMAINNAME=wimmekes.net
    DNSIP=

    # Device used to transfer network information to second node
    # in interview mode
    NETCONFIG_DEV=/dev/xvdc

    # 11gR2 specific data
    SCANNAME=db11vip
    SCANIP=192.168.1.57

    Last few lines of the in-VM log file:

    2012-06-02 14:01:40:[clusterstate:Time :db11rac1] Completed successfully in 2 seconds (0h:00m:02s)
    2012-06-02 14:01:40:[buildcluster:Done :db11rac1] Build 11gR2 RAC Cluster
    2012-06-02 14:01:40:[buildcluster:Time :db11rac1] Completed successfully in 1779 seconds (0h:29m:39s)

    From start_vm to completely configured: 29m:39s. The other 10m was the import template and create 4 VMs from template along with the shared storage configuration. This consists of a complete Oracle 11gR2 RAC database with ASM, CRS and the RDBMS up and running on all 4 nodes. Simply connect and use. Production ready. Oracle on Oracle.

    Read the article

  • How does the GPL work in regards to languages like Dart which compile to other languages?

    - by Peter-W
    Google's Dart language is not supported by any web browsers other than a special build of Chromium known as Dartium. To use Dart for production code you need to run it through a Dart-to-JavaScript compiler/translator and then use the outputted JavaScript in your web application. Because JavaScript is an interpreted language, everyone who receives the "binary" (aka the .js file) has also received the source code. Now, the GNU General Public License v3.0 states that: "The "source code" for a work means the preferred form of the work for making modifications to it." This would imply that the original Dart code, in addition to the JavaScript code, must also be provided to the end user. Does this mean that any web application written in Dart must also provide the original Dart code to all visitors of its website, even though a copy of the source code has already been provided in a human readable/writable/modifiable form?

    Read the article

  • Hold The Date: GlassFish Community Event and Party @ JavaOne 2012 - Sep 30

    - by arungupta
    A yearly tradition for the past 5 years (2007, 2008, 2009, 2010, 2011) is back again this year... The GlassFish Community Event is a gathering of GlassFish community members attending JavaOne. The GlassFish Party is for everybody who is, or would like to be, a friend of GlassFish, in and around the San Francisco Bay Area. This year, again, both events will be happening on the Sunday of JavaOne. The exact coordinates are TBD, but save the date while you are booking flights/hotels.
    GlassFish Community Event - When: Sep 30, 11am - 1pm; Where: TBD
    GlassFish Party - When: Sep 30, 8pm - 10pm; Where: The Thirsty Bear
    Note, a separate JavaOne registration is required to attend the community event. The party is open to everybody, and no JavaOne registration is required. RSVP details are still being worked on and will be shared soon. The 10 reasons to attend these events will help you build your case with management :-)

    Read the article

  • Google I/O 2010 - Building your own Google Wave provider

    Google I/O 2010 - Open source Google Wave: Building your own wave provider. Presented by Dan Peterson, Jochen Bekmann, JD Zamfirescu Pereira and David LaPalomento (Novell). Learn how to build your own wave service. Google is open sourcing the lion's share of the code that went into creating Google Wave to help bootstrap a network of federated providers. This talk will discuss the state of the reference implementation: the software architecture, how you can plug it into your own use cases -- and how you can contribute to the code and definition of the underlying specification. For all I/O 2010 sessions, please go to code.google.com. (Session video: 59:03)

    Read the article

  • Declarative Architectures in Infrastructure as a Service (IaaS)

    - by BuckWoody
    I deal with computing architectures by first laying out requirements, and then laying in any constraints for its success. Only then do I bring in computing elements to apply to the system. As an example, a requirement might be "world-wide availability" and a constraint might be "with less than 80ms response time and full HA" or something similar. Then I can choose from the best fit of technologies, which range from full-up on-premises computing to IaaS, PaaS or SaaS. I also deal in abstraction layers - on-premises systems are fully under your control; in IaaS the hardware is abstracted (but not the OS, scale, runtimes and so on); in PaaS the hardware and the OS are abstracted and you focus on code and data only; and in SaaS everything is abstracted - you merely purchase the function you want (like an e-mail server or some such) and simply use it. When you think about solutions this way, the architecture moves to the primary factor in your decision. It's problem-first architecting, and then laying in whatever technology or vendor best fixes the problem. To that end, most architects design a solution using a graphical tool (I use Visio) and then create documents that let the rest of the team (and business) know what is required. It's the template, or recipe, for the solution. This is extremely easy to do for SaaS - you merely point out what the needs are, research the vendor and present the findings (and bill) to the business. IT might not even be involved there. In PaaS it's not much more complicated - you use the same Application Lifecycle Management and design tools you always have for code, such as Visual Studio or some other process and toolset, and you can "stamp out" the application in multiple locations, update it and so on. IaaS is another story. Here you have multiple machines, operating systems, patches, virus scanning, run-times, scale-patterns and tools and much more that you have to deal with, since essentially it's just an in-house system being hosted by someone else. You can certainly automate builds of servers - we do this as technical professionals every day. From Windows to Linux, it's simple enough to create a "build script" that makes a system just like the one we made yesterday. What is more problematic is being able to tie those systems together in a coherent way (as a solution) and then stamp that out repeatedly, especially when you might want to deploy that solution on-premises, or in one cloud vendor or another. Lately I've been working with a company called RightScale that does exactly this. I'll point you to their site for more info, but the general idea is that you document out your intent for a set of servers, and it will deploy them to on-premises clouds, Windows Azure, and other cloud providers all from the same script. In other words, it doesn't contain the images or anything like that - it contains the scripts to build them on-premises or on a cloud vendor like Microsoft. Using a tool like this, you combine the steps of designing a system (all the way down to passwords and accounts if you wish) and then the document drives the distribution and implementation of that intent. As time goes on, and more and more companies implement solutions on various providers (perhaps for HA and DR), this becomes a compelling investigation. The RightScale information is here, if you want to investigate it further. Yes, there are other methods I've found, but most are tied to a single kind of cloud, and I'm not into vendor lock-in.
    Poppa Bear Level - Hands-on
    Evaluate RightScale at no cost. Just bring your Windows Azure credentials and follow these tutorials:
    • Sign Up for Windows Azure
    • Add Windows Azure to a RightScale Account
    • Windows Azure Virtual Machines 3-tier Deployment
    Momma Bear Level - Just the right level... ;0)
    • Windows Azure Evaluation Guide - if you are new to Windows Azure Virtual Machines and new to RightScale, we recommend that you read the entire evaluation guide to gain a more complete understanding of the Windows Azure + RightScale solution.
    • Windows Azure Support Page @ support.rightscale.com - FAQs, tutorials, etc. for Windows Azure Virtual Machines (work in progress)
    Baby Bear Level - Marketing
    • Windows Azure Page @ www.rightscale.com - find overview information including solution briefs and presentation & demonstration videos
    • Scale and Automate Applications on Windows Azure Solution Brief - how RightScale makes Windows Azure Virtual Machines even better
    • SQL Server on Windows Azure Solution Brief - run highly available SQL Server on Windows Azure Virtual Machines

    Read the article

  • How do you promote your blog or website?

    - by zcourts
    I tend to get what I think are good ideas, and I go out and either build software/websites from scratch or use existing software/tools such as WordPress. But when I'm done, even though I get a few users who say they really like it, I can't seem to get my apps out there, or rather get a large set of eyes on them. So I'm interested in knowing how others do it. I read people's stories of how they did this amazing thing and within 2-3 months they're getting thousands or hundreds of thousands of users per month. It just seems to be all smoke and mirrors. So how have you done it? Or anyone you know who has... Does everyone throw lots of money into their promotion, or something else?

    Read the article

  • Real-world SignalR example, ditching ghetto long polling

    - by Jeff
    One of the highlights of BUILD last week was the announcement that SignalR, a framework for real-time client to server (or cloud, if you will) communication, would be a real supported thing now with the weight of Microsoft behind it. Love the open source flava! If you aren't familiar with SignalR, watch this BUILD session with PM Damian Edwards and dev David Fowler. Go ahead, I'll wait. You'll be in a happy place within the first ten minutes. If you skip to the end, you'll see that they plan to ship this as a real first version by the end of the year. Insert slow clap here. Writing a few lines of code to move around a box from one browser to the next is a way cool demo, but how about something real-world? When learning new things, I find it difficult to be abstract, and I like real stuff. So I thought about what was in my tool box and decided to port my crappy long-polling "there are new posts" feature of POP Forums to use SignalR. A few versions back, I added a feature where a button would light up while you were pecking out a reply if someone else made a post in the interim. It kind of saves you from that awkward moment where someone else posts some snark before you. While I was proud of the feature, I hated the implementation. When you clicked the reply button, it started polling an MVC URL asking if the last post you had matched the last one on the server, and it did it every second and a half until you either replied or the server told you there was a new post, at which point it would display that button. The code was not glam:

        // in the reply setup
        PopForums.replyInterval = setInterval("PopForums.pollForNewPosts(" + topicID + ")", 1500);

        // called from the reply setup and the handler that fetches more posts
        PopForums.pollForNewPosts = function (topicID) {
            $.ajax({
                url: PopForums.areaPath + "/Forum/IsLastPostInTopic/" + topicID,
                type: "GET",
                dataType: "text",
                data: "lastPostID=" + PopForums.currentTopicState.lastVisiblePost,
                success: function (result) {
                    var lastPostLoaded = result.toLowerCase() == "true";
                    if (lastPostLoaded) {
                        $("#MorePostsBeforeReplyButton").css("visibility", "hidden");
                    } else {
                        $("#MorePostsBeforeReplyButton").css("visibility", "visible");
                        clearInterval(PopForums.replyInterval);
                    }
                },
                error: function () { }
            });
        };

    What's going on here is the creation of an interval timer to keep calling the server and bugging it about new posts, and setting the visibility of a button appropriately. It looks like this if you're monitoring requests in FireBug. Gross. The SignalR approach was to call a message broker when a reply was made, and have that broker call back to the listening clients, via a SignalR hub, to let them know about the new post. It seemed weird at first, but the server-side hub's only method is to add the caller to a group, so new post notifications only go to callers viewing the topic where a new post was made. Beyond that, it's important to remember that the hub is also the means to calling methods at the client end. Starting at the server side, here's the hub:

        using Microsoft.AspNet.SignalR.Hubs;

        namespace PopForums.Messaging
        {
            public class Topics : Hub
            {
                public void ListenTo(int topicID)
                {
                    Groups.Add(Context.ConnectionId, topicID.ToString());
                }
            }
        }

    Have I mentioned how awesomely not complicated this is? The hub acts as the channel between the server and the client, and you'll see how JavaScript calls the above method in a moment.
    Next, the broker class and its associated interface:

        using Microsoft.AspNet.SignalR;
        using Topic = PopForums.Models.Topic;

        namespace PopForums.Messaging
        {
            public interface IBroker
            {
                void NotifyNewPosts(Topic topic, int lastPostID);
            }

            public class Broker : IBroker
            {
                public void NotifyNewPosts(Topic topic, int lastPostID)
                {
                    var context = GlobalHost.ConnectionManager.GetHubContext<Topics>();
                    context.Clients.Group(topic.TopicID.ToString()).notifyNewPosts(lastPostID);
                }
            }
        }

    The NotifyNewPosts method uses the static GlobalHost.ConnectionManager.GetHubContext<Topics>() method to get a reference to the hub, and then makes a call to clients in the group matched by the topic ID. It's calling the notifyNewPosts method on the client. The TopicService class, which handles the reply data from the MVC controller, has an instance of the broker new'd up by dependency injection, so it took literally one line of code in the reply action method to get things moving:

        _broker.NotifyNewPosts(topic, post.PostID);

    The JavaScript side of things wasn't much harder. When you click the reply button (or quote button), the reply window opens up and fires up a connection to the hub:

        var hub = $.connection.topics;
        hub.client.notifyNewPosts = function (lastPostID) {
            PopForums.setReplyMorePosts(lastPostID);
        };
        $.connection.hub.start().done(function () {
            hub.server.listenTo(topicID);
        });

    The important part to look at here is the creation of the notifyNewPosts function. That's the method that is called from the server in the Broker class above. Conversely, once the connection is done, the script calls the listenTo method on the server, letting it know that this particular connection is listening for new posts on this specific topic ID. This whole experiment enables a lot of ideas that would make the forum more Facebook-like, letting you know when stuff is going on around you.

    Read the article

  • System Center Service Manager 2010 SP1 Resource Links

    - by Mickey Gousset
    System Center Service Manager is a new product that Microsoft released last year to handle incident/problem/change management. Currently the latest version is System Center Service Manager SP1, and there is a Cumulative Update for SP1 that you should grab as well. A strong ecosystem is starting to spring up around this product, with tools and connectors that fill needs not built into the product. To find the latest list of these items, you need to go to the SCSM 2010 Downloads page. Here you can find a list of the latest tools and add-ons from Microsoft, as well as third-party vendors. The Microsoft Exchange connector and the PowerShell cmdlets are definitely worth downloading and installing.

    Read the article

  • Claims-based Identity in .NET 4.5 and Windows 8

    - by Your DisplayName here!
    There was not a ton of new information about WIF and related technologies at Build, but Samuel Devasahayam did a great talk about claims-based access control that contained some very interesting bits of information with regards to future directions. From his slides:
    Windows 8
    • Bring existing identity claims model into the Windows platform
    • Domain controller issues groups & claims
    • Claims (user and device) sourced from identity attributes in AD
    • Claims delivered in Kerberos PAC
    • NT Token has a new claims section
    • Enhanced SDDL APIs to work with claims
    • Enhanced user-mode CheckAccess APIs to work with claims
    • New ACL-UX
    • Target audits with claims-based expressions
    WIF & .NET 4.5
    • WIF is in the box with .NET Framework 4.5
    • Every principal in .NET 4.5 is a ClaimsPrincipal
    ADFS 2.1
    • ADFS 2.1 is available now as an in-box server role in Windows 8
    • Adds support for issuing device claims from Kerberos ticket
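    To make the "every principal in .NET 4.5 is a ClaimsPrincipal" bullet concrete, here is a small self-contained sketch of constructing and inspecting claims; the claim types and values are made up for the example.

        using System;
        using System.Security.Claims;

        class ClaimsDemo
        {
            static void Main()
            {
                // Build an identity carrying a standard name claim plus a
                // custom claim type (both values are illustrative).
                var identity = new ClaimsIdentity(new[]
                {
                    new Claim(ClaimTypes.Name, "alice"),
                    new Claim("department", "engineering")
                }, "demo-auth");

                // In .NET 4.5 every principal derives from ClaimsPrincipal,
                // so downstream code can reason over claims uniformly.
                var principal = new ClaimsPrincipal(identity);

                foreach (Claim claim in principal.Claims)
                    Console.WriteLine("{0}: {1}", claim.Type, claim.Value);
            }
        }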

    Read the article

  • Row Count Plus Transformation

    As the name suggests we have taken the current Row Count Transform that is provided by Microsoft in the Integration Services toolbox and we have recreated the functionality and extended upon it. There are two things about the current version that we thought could do with cleaning up:
    • Lack of a custom UI
    • You have to type the variable name yourself
    In the Row Count Plus Transformation we solve these issues for you. Another thing we thought was missing is the ability to calculate the time taken between components in the pipeline. An example usage would be that you want to know how many rows flowed between Component A and Component B and how long it took. Again we have solved this issue. Credit must go to Erik Veerman of Solid Quality Learning for the idea behind noting the duration. We were looking at one of his packages and saw that he was doing something very similar, but he was using a Script Component as a transformation (a rough sketch of that approach appears after the installation notes below). Our philosophy is that if you have to write or copy and paste the same piece of code more than once then you should be thinking about a custom component, and here it is.
    The Row Count Plus Transformation populates variables with the values returned from:
    • Counting the rows that have flowed through the path
    • Returning the time in seconds between when it first saw a row come down this path and when it saw the final row
    It is possible to leave both these boxes blank and the component will still work.
    All input columns are passed through the transformation unaltered; you are not permitted to change or add to the inputs or outputs of this component. Optionally you can set the component to fire an event, which happens during the PostExecute phase of the execution. This can be useful to improve visibility of this information, such that it is captured in package logging, or can be used to drive workflow in the case of an error event.
    Properties:
    • OutputRowCountVariable (String) - The name of the variable into which the number of rows read will be passed (optional).
    • OutputDurationVariable (String) - The name of the variable into which the duration in seconds will be passed (optional).
    • EventType (RowCountPlusTransform.EventType) - The type of event to fire during post execute, included in which are the row count and duration values.
    RowCountPlusTransform.EventType enumeration:
    • None (0) - Do not fire any event.
    • Information (1) - Fire an Information event.
    • Warning (2) - Fire a Warning event.
    • Error (3) - Fire an Error event.
    Installation: The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. For 2005/2008 only - finally you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Row Count Plus Transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component?
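    As promised above, here is a minimal sketch of the hand-rolled Script Component approach that this component replaces. It assumes an SSIS data flow Script Component configured with two hypothetical ReadWriteVariables named RowCountOut and DurationOut; the UserComponent base class and Input0Buffer type are generated by the SSIS script designer.

        using System; // already present in designer-generated scripts

        public class ScriptMain : UserComponent
        {
            private int rowCount;
            private DateTime firstRowSeen = DateTime.MinValue;

            public override void Input0_ProcessInputRow(Input0Buffer Row)
            {
                // Record when the first row arrives, then count every row.
                if (firstRowSeen == DateTime.MinValue)
                    firstRowSeen = DateTime.Now;
                rowCount++;
            }

            public override void PostExecute()
            {
                base.PostExecute();
                // ReadWriteVariables are only writable in PostExecute.
                Variables.RowCountOut = rowCount;
                if (firstRowSeen != DateTime.MinValue)
                    Variables.DurationOut = (int)(DateTime.Now - firstRowSeen).TotalSeconds;
            }
        }

    Writing this by hand in every package is exactly the copy-and-paste burden the custom component is meant to remove.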
    We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations; this component requires a minimum of SQL Server 2005 Service Pack 1.
    Downloads: The Row Count Plus Transformation is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.
    • Row Count Plus Transformation for SQL Server 2005
    • Row Count Plus Transformation for SQL Server 2008
    • Row Count Plus Transformation for SQL Server 2012
    Version History:
    SQL Server 2012
    • Version 3.0.0.6 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
    SQL Server 2008
    • Version 2.0.0.5 - SQL Server 2008 release. (15 Oct 2008)
    SQL Server 2005
    • Version 1.1.0.43 - Bug fix for duration. For long running processes the duration second count may have been incorrect. (8 Sep 2006)
    • Version 1.1.0.42 - SP1 compatibility testing. Added the ability to raise an event with the count and duration data for easier logging or workflow. (18 Jun 2006)
    • Version 1.0.0.1 - SQL Server 2005 RTM. Made available as general public release. (20 Mar 2006)
    Troubleshooting: Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005, SQL Server 2008 and SQL Server 2012. If you get an error when you try and use the component along the lines of "The component could not be added to the Data Flow task. Please verify that this component is properly installed. ... The data flow object "Konesans ..." is not installed correctly on this computer", this usually indicates that the internal cache of SSIS components needs to be updated. This is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer. You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. Once installation is complete you need to manually add the task to the toolbox before you will see it and be able to add it to packages - see How do I install a task or transform component?

    Read the article

  • Reaching the Pinnacle of Customer Experience : Customer Concepts WebTV #2

    - by Richard Lefebvre
    The challenge has never been greater - globalization increases consumer choice and quickly converts products into mere commodities. Leading companies understand that delivering exceptional customer experiences and building brand equity are vital for success. Please join us for an exclusive Web TV broadcast to hear how companies are enriching interactions with their customers in new ways to drive measurable business value. Our panel of experts, including customers, industry thought leaders and Oracle executives, will discuss how to refine the customer experience and build a digital experience to win new clients and maximise customer retention. Register now for this pan-European interactive Web TV show on Friday, July 6 at 10 a.m. BST / 11 a.m. CET. Watch and share the teaser video. REPLAY: Fusion CRM Sales - Performance Management on demand - watch and share the teaser video and replay.

    Read the article
