Search Results

Search found 29935 results on 1198 pages for 'open ldap'.


  • Attachment handling for web application with Jackrabbit

    - by Andrea Girardi
    I need to manage attachments in my Spring web application and I thought of using an open source repository. My app is a job approval system built on the J2EE / Spring 3 framework with a Postgres DB, which lets users track a job through every step of the approval process. It is a fully managed, collaborative system that runs from a central server and is accessed through a standard internet browser. A user should be able to attach a file to a request or to an approval step, so I thought of using Jackrabbit with a Postgres-based persistence manager. I took a look at this post: http://onjava.com/pub/a/onjava/2006/10/04/what-is-java-content-repository.html?page=1 It's really interesting, but I have some questions about this kind of solution: I've seen that standalone Jackrabbit ships with an embedded Derby database for persistence; is that enough for professional use of the repository with more than 50 requests per day (with attachments)? Is there a reason why I should use a different database for persistence instead of the default one?
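    For reference, a Jackrabbit repository is pointed at PostgreSQL through the persistence manager section of repository.xml. The snippet below is only a minimal sketch, assuming a local database named "jackrabbit" and the bundled Jackrabbit 2.x PostgreSQL persistence manager class; verify the class name and parameters against the Jackrabbit version you actually deploy.

```xml
<!-- Sketch of a workspace PersistenceManager entry in repository.xml.
     Class name and parameter names assume Jackrabbit 2.x; database name,
     user and password below are placeholders. -->
<PersistenceManager class="org.apache.jackrabbit.core.persistence.pool.PostgreSQLPersistenceManager">
  <param name="driver" value="org.postgresql.Driver"/>
  <param name="url" value="jdbc:postgresql://localhost:5432/jackrabbit"/>
  <param name="user" value="jackrabbit"/>
  <param name="password" value="secret"/>
  <param name="schemaObjectPrefix" value="ws_${wsp.name}_"/>
</PersistenceManager>
```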

    Read the article

  • Overheating on Dell Studio XPS 1645

    - by pjtatlow
    So I was wondering if anyone else has come across this problem and/or has found a solution. When I use my Ubuntu partition, my computer becomes extremely hot and the fan runs very noisily for a long time. If I reboot into Windows while this is happening, my computer actually begins to cool down while doing the exact same tasks. Thinking this might just be a bug in Ubuntu, I installed Fedora on another partition, and the same problem occurs. Is this a problem with the kernel? cpufreq tells me that my CPU is running at 933 MHz out of a possible 1.6 GHz on my Intel Core i7 CPU Q70. For anyone who wants more information, I have 8 GB of memory and an ATI Mobility Radeon HD 5730 graphics card. I'm open to any ideas anyone might have. Thanks in advance!
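    Not an answer, but a small diagnostic sketch that may help narrow things down: the active frequency governor and current clock can be read from sysfs, and lm-sensors reports temperatures, which helps confirm whether the fan noise really corresponds to higher CPU temperatures under Linux. Package names below assume Ubuntu.

```bash
# Check which frequency governor is active and the current clock (standard Linux sysfs paths)
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq

# Install and read temperature sensors (Ubuntu package name)
sudo apt-get install lm-sensors
sudo sensors-detect    # answer the prompts, then:
sensors
```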

    Read the article

  • How to Convert DMG Files to ISO Files on Windows

    - by Taylor Gibb
    The DMG image format is by far the most popular file container format used to distribute software on Mac OS X. Here’s how to convert a DMG file into an ISO file that can be mounted on a Windows PC. First head over to this website and grab yourself a copy of dmg2img by clicking on the win32 binary link. Once the file has downloaded, open your Downloads folder, right click on the file, and select extract all from the context menu.
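    The conversion itself is a single command once the archive is extracted. A hedged sketch, with placeholder file names, run from a Command Prompt opened in the extracted folder:

```bat
rem Convert the DMG to an ISO image (file names are placeholders)
dmg2img.exe macos-image.dmg macos-image.iso
```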

    Read the article

  • EISK – Employee Info Starter Kit 5.0

    - by Tiago Salgado
    Employee Info Starter Kit is an open source project strongly influenced by the 'Pareto Principle' or 80-20 rule: it aims to let a web developer gain 80% of the productivity with 20% of the effort in terms of learning curve and production. It is intended to address different types of real-world challenges faced by web application developers when performing common CRUD operations. Using a single database table, 'Employee', the current release illustrates how to use Microsoft ASP.NET 4.0 Web Form data controls, Entity Framework 4.0 and Visual Studio 2010 effectively in that context. More information is on the CodePlex project site.

    Read the article

  • Brand New Oracle WebLogic 12c Online Launch Event, December 1, 10am PT

    - by B Shashikumar
    The brand new WebLogic 12c will be launched on December 1st with a 2-hour global webcast highlighting salient capabilities and benefits, featuring Hasan Rizvi, SVP, Oracle Fusion Middleware and Java. For the more techie types, the second hour will be a developer-focused discussion including multiple demos and live Q&A. Please join us, with your fellow IT managers, architects, and developers, to hear how the new release of Oracle WebLogic Server is:
      • Designed to help you seamlessly move into the public or private cloud with an open, standards-based platform
      • Built to drive higher value for your current infrastructure and significantly reduce development time and cost
      • Enhanced with transformational platforms and technologies such as Java EE 6, Oracle’s Active GridLink for RAC, Oracle Traffic Director, and Oracle Virtual Assembly Builder

    Read the article

  • How to convert .avi video to .mp4 (for Motorola Milestone, Android 2.3.4) with Avidemux

    - by kv1dr
    I open the .avi video with Avidemux and set the video format to MPEG-4 AVC (under Configure, on the Bitrate tab I choose "Single Pass - Bitrate (Average)" with a target bitrate of 256 kb/s; under Filters I choose MPlayer resize to 480x360 and I also add subtitles), the audio format to AAC (Faac) (under Configure I choose a bitrate of 96) and the container format to MP4 (like the image below). When Avidemux has converted the video to .mp4 I can play the file on my computer, but not on my phone. When I try to play it on my phone with the native video player, it just shows an error, something like "Can't play this video". So the question is how to convert an .avi video to .mp4 with Avidemux (because I want to have the subtitles inside the movie) so that it is playable on an Android phone (Android version 2.3.4) with the native player. Any help will be highly appreciated. :)
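    One frequent culprit on Android 2.3 devices is the H.264 profile: the stock player generally only handles Baseline Profile, while x264's defaults are higher. As an alternative sketch (not the poster's Avidemux workflow), a roughly equivalent ffmpeg command with the subtitles burned in might look like the following; file names are placeholders and the filters assume an ffmpeg build with libx264 and libass.

```bash
# Resize to 480x360, burn in subtitles, encode Baseline H.264 @ 256k + AAC @ 96k into an MP4
# (older ffmpeg builds may additionally need "-strict experimental" for the aac encoder)
ffmpeg -i input.avi \
  -vf "scale=480:360,subtitles=subs.srt" \
  -c:v libx264 -profile:v baseline -level 3.0 -b:v 256k \
  -c:a aac -b:a 96k \
  output.mp4
```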

    Read the article

  • How to make players be creative in a game, if the game cannot evaluate it?

    - by Mensonge
    I am working on a prototype game with several funny/visual effects that the player can trigger. The player can be quite creative in the way these effects are used or combined, but it seems impossible to have the computer detect or evaluate this creativity. So, from a game design perspective, I wonder what features could drive players to be creative (to experiment with various combinations). For the moment I think of "Draw Something", where the result is evaluated by other players. I think of the levels designed by "Little Big Planet" players, but that aspect sits outside the core game. I also think of "Minecraft", but I do not really understand how that game encourages people to be creative (apart from the open world). Please tell me if you have any ideas, articles or references that could help me cope with this problem.

    Read the article

  • Consoles in Ubuntu and automatic upgrade

    - by Muhammad Khan
    So I recently discovered that Ubuntu simultaneously runs six consoles in addition to the GUI that everybody uses; they can be reached by pressing Ctrl+Alt+F1 ... +F6, and the default GUI with Ctrl+Alt+F7. What use are these consoles when I can just open a terminal in the GUI? Also, why is having consoles like this useful for computer users; wouldn't a GUI be much simpler? Also, running the console told me that I was running a development version of Ubuntu Quantal, which is version 12.10. The login screen (correctly?) says that I'm running 12.04 LTS. What does that mean? Thanks everybody!
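    On the version confusion, the running release can be checked from any of those consoles or from a terminal; a quick sketch:

```bash
lsb_release -a      # prints the release the system actually reports
cat /etc/issue      # the banner shown on the text consoles
```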

    Read the article

  • Good Literature for "Object oriented programming in C"

    - by Dipan Mehta
    This is not a debate question about whether or not C is a good candidate for object oriented programming. Quite often C is the primary platform where development happens. I have seen, and hopefully learnt by crawling through many open source and commercial projects, that while the language doesn't inherently stop you from writing "non-object" code, you can still think in an "object" way and reasonably write code that captures that design thinking. For those who have done this, the OO way is still the best way to write code, even when you are programming in C. While I have learnt most of it the hard way, is there any in-depth literature that can help educate relatively young developers on how to do OO programming in C?
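    As a tiny illustration of the style most of that literature teaches (an opaque struct plus a table of function pointers standing in for a vtable), here is a minimal sketch; the type and function names are made up for the example.

```c
#include <stdio.h>
#include <stdlib.h>

/* "Interface": a table of function pointers acts as a vtable. */
typedef struct Shape Shape;
struct Shape {
    double (*area)(const Shape *self);
    void   (*destroy)(Shape *self);
};

/* "Subclass": embeds the base struct as its first member. */
typedef struct {
    Shape  base;
    double radius;
} Circle;

static double circle_area(const Shape *self) {
    const Circle *c = (const Circle *)self;  /* safe: Shape is the first member */
    return 3.14159265358979 * c->radius * c->radius;
}

static void circle_destroy(Shape *self) {
    free(self);
}

Shape *circle_new(double radius) {
    Circle *c = malloc(sizeof *c);
    c->base.area    = circle_area;
    c->base.destroy = circle_destroy;
    c->radius       = radius;
    return &c->base;
}

int main(void) {
    Shape *s = circle_new(2.0);
    printf("area = %f\n", s->area(s));  /* dynamic dispatch through the vtable */
    s->destroy(s);
    return 0;
}
```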

    Read the article

  • How can I downgrade my version of Evolution to the one used in Ubuntu 11.04?

    - by Johnny
    I just upgraded, and like a few other users I had issues with Evolution email after the upgrade from 11.04 to 11.10 and then 12.04. I know to make backups, but in this case I stupidly didn't think that the program would be changed (Firefox wasn't modified at all), and so I failed to make a backup. Three days later I am still having issues, and recovering the emails is proving to be difficult, with only partial recovery or none at all. My question is: can I add a source to get the 11.04 version of Evolution, since that version was working fine and would know what to do with the current files (Inbox, Outbox, etc.)? I also noticed that Evolution's restore feature said it changed the way emails are handled, so it seems like a downgrade could put everything back to normal. Worst case scenario, I start over, but I wanted to try everything first. Thanks in advance! I'm also open to any suggestions for restoring the old email files into the current version of Evolution.
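    For what it's worth, apt can install a specific older version if one is still published in a source the machine can reach; a hedged sketch, where the version string is a placeholder taken from the policy output:

```bash
apt-cache policy evolution                    # lists the versions apt can currently see
sudo apt-get install evolution=<old-version>  # e.g. a 3.0.x build, if an 11.04 (natty) source is enabled
```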

    Read the article

  • Difference between ~/folder and /home/username/folder when creating a path in /etc/environment

    - by r0xx4nne
    I had an executable script on my Ubuntu machine located in the ~/project/ directory and I tried to add that path to /etc/environment. So I edited the path to PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:~/project/". Then I logged out and back in, opened the terminal as su and ran the command to execute my script in that folder, but the result was "command not found". Then I changed the path in /etc/environment to PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/home/r0xx4nne/project/" and voila, it works. Now I can run the executable script inside ~/project/ without fail under the su command. My question is: what's the difference between ~/project and /home/r0xx4nne/project when creating a path in /etc/environment? Why does it behave like this? I am a newbie and I just want to know more. Thanks for any reply.
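    The short version is that /etc/environment is read by pam_env, not by a shell, so ~ is stored literally and never expanded; only a shell turns ~ (or $HOME) into /home/r0xx4nne. A sketch of the two working alternatives, using the username from the question:

```bash
# /etc/environment (parsed by pam_env: no tilde or variable expansion, so spell the path out)
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/home/r0xx4nne/project"

# ~/.profile (read by a login shell, which *does* expand $HOME)
PATH="$HOME/project:$PATH"
```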

    Read the article

  • Decrease filesize when resizing with mogrify

    - by plua
    I love the command line options of ImageMagick. Mogrify is great for resizing images and changing quality, which is what I use most often. However, I have noted that the file size is often larger than it should be, especially with small images. For instance, I have a regular 640px-wide photo, which I change to quality 80 and a width of 80px: mogrify -quality 80 -resize 80 file.jpg This works well: my image gets resized and the quality is changed to 80. However, the file size is around 40 KB. For such a tiny image, that is huge! When I use mtPaint and simply open the file and save it (not changing anything, just Ctrl+O, Ctrl+S), the file size decreases by more than 95% to less than 2 KB! I have seen this is often the case. What goes wrong?
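    A likely factor is metadata: -resize keeps the original EXIF data and embedded thumbnail, which can dwarf an 80-pixel JPEG, whereas mtPaint writes a bare file. A hedged sketch of the same resize with profiles stripped:

```bash
# Same resize, but drop EXIF/ICC profiles and comments that dominate tiny files
mogrify -strip -quality 80 -resize 80 file.jpg

# Or use -thumbnail, which resizes and strips most profile data in one go
mogrify -quality 80 -thumbnail 80 file.jpg
```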

    Read the article

  • How to control an Ubuntu PC from another Ubuntu PC over Internet?

    - by Naveen
    There are two Ubuntu PCs, called A and B. A and B are connected to the Internet using two separate Internet connections (in my case, two mobile broadband connections, ppp0 x2). Each connection has a unique and static public IP address. What I need is to control computer A's cursor using computer B's mouse, over the Internet. On both computers I have allowed other users to control my computer in the Desktop Sharing preferences, as below: When I try to connect to A from B using the Remmina Remote Desktop Client, it refuses to connect after trying for a while. These are my settings: I expect this to be done with available open source software, not with TeamViewer. I found this guide hard to understand. Please provide me with clear instructions... Thanks for having a look!
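    If you stay with the built-in Desktop Sharing (vino) plus Remmina, two things commonly block the connection and are worth checking on machine A. A sketch, assuming Ubuntu's default VNC port 5900 and the GNOME vino settings schema:

```bash
# 1. Make sure the VNC port is reachable (open it in ufw and in any router/NAT in front of A)
sudo ufw allow 5900/tcp

# 2. Vino's built-in encryption is not understood by every VNC client; disabling it is a common workaround
gsettings set org.gnome.Vino require-encryption false
```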

    Read the article

  • Eye of GNOME image bug with ATI graphics driver

    - by thonixx
    I just installed the ATI driver on my Ubuntu 11.10. After some annoying bugs and errors it works for now. But there is one really annoying bug: whenever I open a picture in the default image viewer (Eye of GNOME, EOG) it shows me an overexposed picture. Example with EOG: http://ubuntuone.com/4tJHSINBUPjypmcV2EXUF5 Example of how it should look: http://ubuntuone.com/1DnwJ1pdQKUCloBcV1kcY5 How can I fix this? Update: the driver I used was 8.911-111025a-128237C-ATI with Catalyst 11.11. I installed the driver via Jockey and used the driver released with Ubuntu, because the post-release driver fails every time.

    Read the article

  • Why will Network Manager not allow me to save my VPN settings?

    - by Solignis
    I am trying to configure an OpenVPN client on my laptop. I am running Ubuntu 11.10 64-bit. When I open Network Manager and import the VPN settings from the premade config folder, everything takes. The problem is that when I try to save the settings, the save button at the bottom of the Network Manager applet is greyed out. Furthermore, when I hover over the button it says "Authenticate to save this connection for all users of this machine." The problem is I did not check the box "Available to all users": it was already checked, and it is also greyed out and won't let me change it. What is going on? Is this a bug, or is there something I am missing? Any help would be wonderful.

    Read the article

  • Good alternative to NetLimiter?

    - by Harsh
    There is a program called NetLimiter for Windows. While I was using Windows it was very useful for finding the IP address of the person who was downloading from me, or for finding the IP address of anyone on the LAN who was using DC++ under some nick. After that I could easily find that person's computer name using nbtstat. I was wondering if there is any tool for Ubuntu with which I can find the IP address of the person who is downloading from me, or from whom I am downloading, on the LAN. I am on a university LAN and we use PtokaX and DC++ for file sharing. People sometimes post offensive stuff in the open chat on DC++ under some nick, and I don't know how to trace them while I am using Ubuntu.
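    Built-in tools already expose most of this: the peers of an active DC++ transfer show up in the socket table, and the NetBIOS name of a LAN address can be queried with nmblookup (the Samba counterpart of nbtstat). A sketch, where the grep pattern and the IP address are placeholders:

```bash
# List established connections with the owning process; filter on your DC++ client's name
sudo netstat -tnp | grep -i dc

# Per-process bandwidth view, closest to what NetLimiter showed (Ubuntu packages)
sudo apt-get install nethogs iftop
sudo nethogs

# Resolve a LAN IP to its NetBIOS/computer name, like "nbtstat -A" on Windows
nmblookup -A 192.168.1.42
```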

    Read the article

  • Upgrade from Ubuntu 13.04 to 13.10 causes VMware Workstation 9 problems

    - by dan
    The upgrade caused a problem where running VMware Workstation 9 needed patches to accommodate Linux kernel 3.11. I applied the fixes I found that others had reported, and now I can only run VMware Workstation 9 with sudo. If I run it as a standard user, it says it wants to recompile modules, which it does not do unless I open up a terminal and run sudo vmware. That works, but I would like it to work correctly, i.e. have the recompiled modules stick. When running under sudo vmware, it does recompile, with errors: (vmware-unity-helper:13019): Gtk-WARNING **: Unable to locate theme engine in module_path: "murrine", and then it starts up and works OK. Any ideas? Thanks for any help you can provide.
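    A workaround often suggested for this symptom is to rebuild the modules once, explicitly, with the helper VMware ships, so that subsequent launches as a normal user find them already compiled; a hedged sketch:

```bash
# Rebuild all VMware kernel modules against the running 3.11 kernel (one-off, as root)
sudo vmware-modconfig --console --install-all
```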

    Read the article

  • Roll Your Own Wi-Fi Spy Camera

    - by Jason Fitzpatrick
    This fun DIY project allows you to roll your own Wi-Fi based spy camera and then, when it’s time for a new project, pull apart the modular design and build something new. This build combines an Arduino board, an Adafruit Data Logging Shield, and a serial-based camera (along with a handful of small parts and open-source code) into a spy camera that remotely delivers photos via Wi-Fi. The nice thing about this project is that you can easily deconstruct the build to reuse the parts in a new project (the number of things you can do with an Arduino is nearly limitless). Hit up the link below for an excellent and well documented tutorial over at LadyAda.net. “Internet of Things” Camera [via DIYPhotography]

    Read the article

  • Pace Layering Comes Alive

    - by Tanu Sood
    Rick Beers is Senior Director of Product Management for Oracle Fusion Middleware. Prior to joining Oracle, Rick held a variety of executive operational positions at Corning, Inc. and Bausch & Lomb. With a professional background that includes senior management positions in manufacturing, supply chain and information technology, Rick brings a unique set of experiences to cover the impact that technology can have on business models, processes and organizations. Rick hosts the IT Leaders Editorial on a monthly basis.

    By now, readers of this column are quite familiar with Oracle AppAdvantage, a unified framework of middleware technologies, infrastructure and applications utilizing a pace layered approach to enterprise systems platforms.

    1. Standardize and Consolidate core Enterprise Applications by removing invasive customizations, costly workarounds and the complexity that multiple instances creates.
    2. Move business specific processes and applications to the Differentiate Layer, thus creating greater business agility with process extensions and best of breed applications managed by cross-application process orchestration.
    3. The Innovate Layer contains all the business capabilities required for engagement, collaboration and intuitive decision making. This is the layer where innovation will occur, as people engage one another in a secure yet open and informed way.
    4. Simplify IT by minimizing complexity, improving performance and lowering cost with secure, reliable and managed systems across the entire Enterprise.

    But what hasn’t been discussed is the pace layered architecture that Oracle AppAdvantage adopts. What is it, what are its origins and why is it relevant to enterprise scale applications and technologies? It’s actually a fascinating tale that spans the past 20 years, and a basic understanding of it provides a wonderful context to what is evolving as the future of enterprise systems platforms.

    It all begins in 1994 with a book by noted architect Stewart Brand, of ’Whole Earth Catalog’ fame. In his 1994 book How Buildings Learn, Brand popularized the term ‘Shearing Layers’, arguing that any building is actually a hierarchy of pieces, each of which inherently changes at different rates. In 1997 he produced a 6-part BBC series adapted from the book, in which Part 6 focuses on Shearing Layers. In this segment Brand begins to introduce the concept of ‘pace’. Brand further refined this idea in his subsequent book, The Clock of the Long Now, which began to link the concept of Shearing Layers to computing and introduced the term ‘pace layering’, where he proposes that: “An imperative emerges: an adaptive [system] has to allow slippage between the differently-paced systems … otherwise the slow systems block the flow of the quick ones and the quick ones tear up the slow ones with their constant change.
Embedding the systems together may look efficient at first but over time it is the opposite and destructive as well.” In 2000, IBM architects Ian Simmonds and David Ing published a paper entitled A Shearing Layers Approach to Information Systems Development, which applied the concept of Shearing Layers to systems design and development. It argued that at the time systems were still too rigid; that they constrained organizations by their inability to adapt to changes. The findings in the Conclusions section are particularly striking: “Our starting motivation was that enterprises need to become more adaptive, and that an aspect of doing that is having adaptable computer systems. The challenge is then to optimize information systems development for change (high maintenance) rather than stability (low maintenance). Our response is to make it explicit within software engineering the notion of shearing layers, and explore it as the principle that systems should be built to be adaptable in response to the qualitatively different rates of change to which they will be subjected. This allows us to separate functions that should legitimately change relatively slowly and at significant cost from that which should be changeable often, quickly and cheaply.” The problem at the time of course was that this vision of adaptable systems was simply not possible within the confines of 1st generation ERP, which were conceived, designed and developed for standardization and compliance. It wasn’t until the maturity of open, standards based integration, and the middleware innovation that followed, that pace layering became an achievable goal. And Oracle is leading the way. Oracle’s AppAdvantage framework makes pace layering come alive by taking a strategic vision 20 years in the making and transforming it to a reality. It allows enterprises to retain and even optimize their existing ERP systems, while wrapping around those ERP systems three layers of capabilities that inherently adapt as needed, at a pace that’s optimal for the enterprise.

    Read the article

  • How can a .NET programmer learn Big Data/Hadoop? [on hold]

    - by Smith Pascal Jr.
    I have been an ASP.NET developer for some time now, and I have been reading a lot about Big Data and Hadoop and its future: how it is the next big technology in IT and how it is expected to create millions of jobs in the US and elsewhere in the world. Now, since Hadoop is an open source big data tool managed by the Apache Software Foundation, I'm assuming I have to be well versed in Java - correct me if I'm wrong. Moreover, how can a .NET programmer learn Big Data and its related technologies and work professionally full time with this technology? What challenges and opportunities does a .NET professional face when changing technology platform? Please advise. Thanks

    Read the article

  • Creating an anonymous site in SharePoint 2010

    - by shehan
    Here’s how:
    1. Open up the Central Administration site and click on “Manage Web Applications” under the “Application Management” section.
    2. From the ribbon, click on “New”. (Note: if it’s an existing web app, click on “Extend” instead.)
    3. Fill in the fields with appropriate values. Under “Security Configurations”, make sure to select “Yes” for “Allow Anonymous”.
    4. Click OK.
    5. Once the web application has been created, a site collection needs to be created. Navigate to “Application Management” -> “Create Site Collection”.
    6. Fill in the fields with the appropriate values and create the site collection.
    7. Next, sign into the newly created site collection as the Site Collection Administrator. From the “Site Actions” menu, select “Site Permissions”.
    8. In the permissions page that loads, click on the Anonymous Access button on the ribbon. A modal dialog will pop up. Select the appropriate option and click OK. If you selected “Entire Web Site”, it is advisable to restart the browser to test anonymous access.
    Technorati Tags: SharePoint 2010, anonymous, site collection, web application
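    For repeatability, the web application and site collection steps can also be scripted. A hedged PowerShell sketch using the SharePoint 2010 cmdlets; the names, URL and managed account below are placeholders, and the anonymous-access scope still has to be set on the site afterwards, as in steps 7-8 above.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell

# Web application with anonymous access enabled (placeholder names, URL and accounts)
New-SPWebApplication -Name "Public Site" -Port 80 -HostHeader "www.contoso.com" `
    -Url "http://www.contoso.com" -AllowAnonymousAccess `
    -ApplicationPool "PublicSiteAppPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "CONTOSO\spAppPool")

# Site collection inside it (team site template as an example)
New-SPSite -Url "http://www.contoso.com" -OwnerAlias "CONTOSO\spAdmin" `
    -Template "STS#0" -Name "Public Site"
```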

    Read the article

  • PostgreSQL 9.1, pgadmin III, Ubuntu 12.04 LTS, support functions

    - by Chaz SLiger
    When pgAdmin III is used to open a PostgreSQL database, the following message appears. There does not seem to be any obvious package listed in the Ubuntu Software Center for this.
    "The server lacks instrumentation functions. pgAdmin III uses some support functions that are not available by default in all PostgreSQL versions. These enable some tasks that make life easier when dealing with log files and configuration files. The adminpack is installed and activated by default if you are running the one-click installer of PostgreSQL. On Unix, you may have to install the contrib package, either with your package installer tool or by compilation."
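    On Ubuntu the functions pgAdmin is asking for live in the contrib package, and from PostgreSQL 9.1 onward they are enabled per database with CREATE EXTENSION. A sketch, with the package name as shipped for the 9.1 server on 12.04:

```bash
# Install the contrib modules for the 9.1 server
sudo apt-get install postgresql-contrib-9.1

# Enable adminpack in the maintenance database pgAdmin connects to (usually "postgres")
sudo -u postgres psql -d postgres -c "CREATE EXTENSION adminpack;"
```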

    Read the article

  • Windows Azure Virtual Machine Readiness and Capacity Assessment for SQL Server

    - by SQLOS Team
    Windows Azure Virtual Machine Readiness and Capacity Assessment for Windows Server Machines Running SQL Server

    With the release of MAP Toolkit 8.0 Beta, we have added a new scenario to assess your Windows Azure Virtual Machine readiness. MAP 8.0 Beta performs a comprehensive assessment of Windows Servers running SQL Server to determine your level of readiness to migrate an on-premise physical or virtual machine to Windows Azure Virtual Machines. The MAP Toolkit then offers suggested changes to prepare the machines for migration, such as upgrading the operating system or SQL Server. MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback is very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys.

    Now, let’s walk through the MAP Toolkit tasks for completing the Windows Azure Virtual Machine assessment and capacity planning. The tasks include the following:
    • Perform an inventory
    • View the Windows Azure VM Readiness results and report
    • Collect performance data to determine VM sizing
    • View the Windows Azure Capacity results and report

    Perform an inventory:
    1. To perform an inventory against a single machine or across a complete environment, choose Perform an Inventory to launch the Inventory and Assessment Wizard as shown below.
    2. After the Inventory and Assessment Wizard launches, select either the Windows computers or SQL Server scenario to inventory Windows machines. HINT: If you don’t care about completely inventorying a machine, just select the SQL Server scenario. Click Next to continue.
    3. On the Discovery Methods page, select how you want to discover computers and then click Next to continue. Description of discovery methods:
       • Use Active Directory Domain Services -- This method allows you to query a domain controller via the Lightweight Directory Access Protocol (LDAP) and select computers in all or specific domains, containers, or OUs. Use this method if all computers and devices are in AD DS.
       • Windows networking protocols -- This method uses the WIN32 LAN Manager application programming interfaces to query the Computer Browser service for computers in workgroups and Windows NT 4.0–based domains. If the computers on the network are not joined to an Active Directory domain, use only the Windows networking protocols option to find computers.
       • System Center Configuration Manager (SCCM) -- This method enables you to inventory computers managed by System Center Configuration Manager (SCCM). You need to provide credentials to the System Center Configuration Manager server in order to inventory the managed computers. When you select this option, the MAP Toolkit will query SCCM for a list of computers and then MAP will connect to these computers.
       • Scan an IP address range -- This method allows you to specify the starting address and ending address of an IP address range. The wizard will then scan all IP addresses in the range and inventory only those computers. Note: this option can perform poorly if many IP addresses aren’t being used within the range.
       • Manually enter computer names and credentials -- Use this method if you want to inventory a small number of specific computers.
       • Import computer names from a file -- Using this method, you can create a text file with a list of computer names that will be inventoried.
    4. On the All Computers Credentials page, enter the accounts that have administrator rights to connect to the discovered machines. This does not need to be a domain account, but it needs to be a local administrator. I have entered my domain account, which is an administrator on my local machine. Click Next after one or more accounts have been added. NOTE: The MAP Toolkit primarily uses Windows Management Instrumentation (WMI) to collect hardware, device, and software information from the remote computers. In order for the MAP Toolkit to successfully connect to and inventory computers in your environment, you have to configure your machines for inventory through WMI and also allow your firewall to enable remote access through WMI (see the sketch after this walkthrough). The MAP Toolkit also requires remote registry access for certain assessments. In addition to enabling WMI, you need accounts with administrative privileges to access desktops and servers in your environment.
    5. On the Credentials Order page, select the order in which you want the MAP Toolkit to connect to the machine and SQL Server. Generally just accept the defaults and click Next.
    6. On the Enter Computers Manually page, click Create to pull up a dialog to enter one or more computer names.
    7. On the Summary page, confirm your settings and then click Finish. After clicking Finish, the inventory process will start, as shown below.

    Windows Azure Readiness results and report: After the inventory has completed, you can review the results under the Database scenario. On the tile, you will see the number of Windows Server machines with SQL Server that were analyzed, the number of machines that are ready to move without changes, and the number of machines that require further changes. If you click this Azure VM Readiness tile, you will see additional details and can generate the Windows Azure VM Readiness Report. After the report is generated, select View | Saved Reports and Proposals to view the location of the report. Open up the WindowsAzureVMReadiness* report in Excel. On the Windows tab, you can see the results of the assessment. This report has a column for the Operating System and SQL Server assessment and provides a recommendation on how to resolve the issue if a component is not supported.

    Collect performance data: Launch the Performance Wizard to collect performance information for the Windows Server machines that you would like the MAP Toolkit to suggest a Windows Azure VM size for.

    Windows Azure Capacity results and report: After the performance metrics are collected, the Azure VM Capacity tile will display the number of Virtual Machine sizes that are suggested for the Windows Server and Linux machines that were analyzed. You can then click on the Azure VM Capacity tile to see the capacity details and generate the Windows Azure VM Capacity Report. Within this report, you can view the performance data that was collected and the suggested Virtual Machine sizes.

    MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback is very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys.

    Useful References:
    • Windows Azure Homepage
    • How to guides for Windows Azure Virtual Machines
    • Provisioning a SQL Server Virtual Machine on Windows Azure
    • Windows Azure Pricing

    Peter Saddow, Senior Program Manager – MAP Toolkit Team
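    As a side note on the WMI prerequisite in step 4, the firewall exceptions can be scripted on each target machine; a hedged sketch using the built-in Windows Firewall rule groups (run from an elevated command prompt):

```bat
rem Allow remote WMI through Windows Firewall (built-in rule group)
netsh advfirewall firewall set rule group="windows management instrumentation (wmi)" new enable=yes

rem Allow remote administration (covers the remote registry access used by some assessments)
netsh advfirewall firewall set rule group="remote administration" new enable=yes
```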

    Read the article

  • Windows for IoT, continued

    - by Valter Minute
    Originally posted on: http://geekswithblogs.net/WindowsEmbeddedCookbook/archive/2014/08/05/windows-for-iot-continued.aspx

    I received a lot of interesting feedback on my previous blog post and I tried to find some time to do some additional tests. Bert Kleinschmidt pointed out that pins 2, 3 and 10 of the Galileo are connected directly to the SoC, while pin 13, the one used for the sample sketch, is controlled via an I2C I/O expander. I changed my code to use pin 2 instead of 13 (just changing the variable assignment at the beginning of the code) and latency was greatly reduced. Now each pulse lasts for 1.44 ms, 44% more than the expected time, but way better than the result we got using pin 13. I also used SetThreadPriority to increase the priority of the thread running the sketch to THREAD_PRIORITY_HIGHEST, but that didn't change the results. When I was using the I2C-controlled pin I tried the same and the timings got far worse (increasing more than 10 times), so I did not comment on that part, wanting to investigate the issue a bit more in detail. It seems that increasing the priority of the application thread negatively impacts the I2C communication.

    I also tried the Linux-based implementation (using a different Galileo board, since the one provided by MS seems to use a different firmware), and the results of running the sample blink sketch modified to use pin 2 and blink the LED for 1 ms are similar to those we got on the same board running Windows. Here the difference between expected time and measured time is worse, getting around 3.2 ms instead of 1 (320% compared to 150% using Windows, but far from the 100.1% we got with the 8-bit Arduino). Neither system was under load during the test; loading some applications that use part of the CPU time would probably make those timings even less reliable, but I think those numbers are enough to draw some conclusions. It may not be worth running a full OS if what you need is Arduino compatibility. The Arduino UNO is probably the best Arduino you can find for this kind of development.

    The Galileo running the Linux-based stack or running Windows for IoT is targeted to be a platform for "Internet of Things" devices, whatever that means. At the moment I don't see the "I" part of IoT. We have low level interfaces (SPI, I2C, the GPIO pins) that can be used to connect sensors, but the support for connectivity is limited, the amount of work required to deliver some data to the cloud (using a secure HTTP request or a message queuing system like AMQP or MQTT) is still big, and the rich OS underneath does not seem to provide any help doing that. Why should I have to use sockets and not have access to all the high level connectivity features we have on "full" Windows? I know that it's possible to use some third party libraries, try to build them using the Windows for IoT SDK, etc., but this means re-inventing the wheel every time and can also lead to some IP concerns if used for products meant to be closed-source. I hope that MS and Intel (and others) will focus less on the "coolness" of running (some) Arduino sketches and more on providing a better platform to people who really want to design devices that leverage internet connectivity and cloud processing power to deliver better products and services. Providing a reliable set of connectivity services would be a great start. Providing support for .NET would be even better, leaving native code available for hardware access etc.
I know that those components may require additional storage and memory etc. So making the OS componentizable (or, at least, provide a way to install additional components) would be a great way to let developers pick the parts of the system they need to develop their solution, knowing that they will integrate well together. I can understand that the Arduino and Raspberry Pi* success may have attracted the attention of marketing departments worldwide and almost any new development board those days is promoted as "XXX response to Arduino" or "YYYY alternative to Raspberry Pi", but this is misleading and prevents companies from focusing on how to deliver good products and how to integrate "IoT" features with their existing offer to provide, at the end, a better product or service to their customers. Marketing is important, but can't decide the key features of a product (the OS) that is going to be used to develop full products for end customers integrating it with hardware and application software. I really like the "hackable" nature of open-source devices and like to see that companies are getting more and more open in releasing information, providing "hackable" devices and supporting developers with documentation, good samples etc. On the other side being able to run a sketch designed for an 8 bit microcontroller on a full-featured application processor may sound cool and an easy upgrade path for people that just experimented with sensors etc. on Arduino but it's not, in my humble opinion, the main path to follow for people who want to deliver real products.   *Shameless self-promotion: if you are looking for a good book in Italian about the Raspberry Pi , try mine: http://www.amazon.it/Raspberry-Pi-alluso-Digital-LifeStyle-ebook/dp/B00GYY3OKO
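    For anyone who wants to reproduce the measurement, the test described above is essentially the stock blink example moved to pin 2 with 1 ms delays; a minimal sketch:

```cpp
// Minimal timing test: toggle pin 2 (wired directly to the SoC on the Galileo) every 1 ms.
// Measure the actual pulse width on pin 2 with a logic analyzer or oscilloscope.
const int testPin = 2;

void setup() {
  pinMode(testPin, OUTPUT);
}

void loop() {
  digitalWrite(testPin, HIGH);
  delay(1);               // nominal 1 ms high
  digitalWrite(testPin, LOW);
  delay(1);               // nominal 1 ms low
}
```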

    Read the article

  • Digikam: What's the problem?

    - by Unapiedra
    I installed digiKam using the Philip5 PPA. When I run it I get the error below (this is from running it through gdb):
    Starting program: /usr/bin/digikam
    /usr/bin/digikam: error while loading shared libraries: libcxcore.so.2.1: cannot open shared object file: No such file or directory
    [Inferior 1 (process 29894) exited with code 0177]
    What should I do to find and fix the error? I can see that somehow libcxcore.so.2.1 is wanted but not found. Is this an error in the PPA, or can I simply point it in the right direction? Can I raise an issue with the PPA creator through Launchpad? Some next steps would be quite helpful.
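    One way to narrow this down: list which shared libraries the binary fails to resolve, then ask apt which package (if any) ships the missing OpenCV 2.1 library. A sketch using apt-file:

```bash
# Show every library the digikam binary cannot resolve
ldd /usr/bin/digikam | grep "not found"

# Find out which package provides the missing OpenCV 2.1 core library
sudo apt-get install apt-file && sudo apt-file update
apt-file search libcxcore.so.2.1
```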

    Read the article
