Search Results

Search found 38931 results on 1558 pages for 'database testing'.


  • How Circuit Boards Are Manufactured and Tested [Video]

    - by Jason Fitzpatrick
    Circuit boards are in nearly everything: computers, cars, toys, phones, even greeting cards. Check out this tour of a Printed Circuit Board (PCB) factory to see how they’re made. In the above video the owners of Base2 Electronics are watching a PCB testing machine at the factory where they purchase their boards for resale. The machine first scans the board to identify it in the board database, and then the arms start flying as it tests individual circuits on the board. If you’re interested in seeing all the steps of the manufacturing process, hit up the link below for a photo and video tour of the facility. Base2 Electronics Tour of Advanced Circuits [via Hack A Day]

    Read the article

  • CDbException: CDbConnection failed to open the DB connection

    - by Vinay Rajput
    Hi, I am new to Ubuntu and PHP/MySQL. I have installed XAMPP and am learning Yii, but while testing a script I got this problem and have not been able to figure out the solution; I have been through many forum solutions but none of them worked for me. Please help.

    1) DbTest::testConnection
    CDbException: CDbConnection failed to open the DB connection.
    /opt/lampp/htdocs/YiiRoot/framework/db/CDbConnection.php:388
    /opt/lampp/htdocs/YiiRoot/framework/db/CDbConnection.php:331
    /opt/lampp/htdocs/YiiRoot/framework/db/CDbConnection.php:309
    /opt/lampp/htdocs/YiiRoot/framework/base/CModule.php:388
    /opt/lampp/htdocs/YiiRoot/framework/base/CModule.php:104
    /opt/lampp/htdocs/trackstar/protected/tests/unit/DbTest.php:6
    FAILURES! Tests: 1, Assertions: 0, Errors: 1.
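    One hedged place to start (a sketch, not a confirmed fix): with XAMPP on Linux, MySQL usually listens on the /opt/lampp socket rather than on localhost:3306, so the 'db' component in protected/config/test.php has to match. The database name and credentials below are placeholders:

      <?php
      // protected/config/test.php - a sketch only; adjust dbname, username and password
      return CMap::mergeArray(
          require(dirname(__FILE__).'/main.php'),
          array(
              'components'=>array(
                  'db'=>array(
                      // XAMPP for Linux exposes MySQL through this socket by default
                      'connectionString'=>'mysql:unix_socket=/opt/lampp/var/mysql/mysql.sock;dbname=trackstar_test',
                      'username'=>'root',
                      'password'=>'',
                  ),
              ),
          )
      );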

    Read the article

  • Internet Explorer 10 Release Preview now available for Windows 7 SP1!

    - by KeithMayer
    This week, the IE team released IE 10 Release Preview for Windows 7 SP1 and Windows Server 2008 R2 SP1!  You can download IE10 Release Preview for evaluation and testing (remember, it's still pre-release software) from the following link ... Download IE10 Release Preview: http://windows.microsoft.com/en-US/internet-explorer/downloads/ie-10/worldwide-languages You can get an overview of What's New in Internet Explorer 10 at: Internet Explorer 10 FAQ for IT Pros Of course, you can also get the full release of IE10 by downloading Windows 8 at http://aka.ms/dlw8rtm What's Next? After downloading IE10 Release Preview, begin setting up your lab environment to plan how you'll customize and deploy IE10 in your environment when it's released, with these resources: IE10 Customization and Administration Internet Explorer Administration Kit (IEAK) 10 Group Policy Settings Reference Hope this helps! Keith

    Read the article

  • Poor mobile performance when running from Eclipse

    - by Yajirobe_LOL
    So after weeks of thinking my rendering code was bad, I accidentally discovered the following when running my game on a Nexus S:

    - From Eclipse (Debug As - Android application): 12fps
    - From the device while still attached to USB (still getting log info in Eclipse): 24fps
    - From the device while not attached via USB: 56fps

    I was wondering if anyone else has issues like this. I mean, it really isn't a problem since the final release build will likely have good performance, but for the time being I don't want to have to keep (un)plugging my device in and out when testing code all day long. Is there some remedy for this, or does anyone have any input/advice? Thanks.
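    For anyone comparing the same three numbers, a hedged first check (my sketch, not part of the original question) is to log whether a debugger is actually attached at startup, since Eclipse's "Debug As" runs with the debugger connected and that alone can halve frame rates:

      // Hedged sketch: call once at startup; the tag and message are arbitrary
      private void logDebuggerState() {
          if (android.os.Debug.isDebuggerConnected()) {
              android.util.Log.d("Perf", "Debugger attached - expect much lower frame rates");
          }
      }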

    Read the article

  • My Right-to-Left Foot (T-SQL Tuesday #13)

    - by smisner
    As a business intelligence consultant, I often encounter the situation described in this month's T-SQL Tuesday, hosted by Steve Jones (Blog | Twitter) – “What the Business Says Is Not What the Business Wants.” Steve posed the question, “What issues have you had in interacting with the business to get your job done?” My profession requires me to have one foot firmly planted in the technology world and the other foot planted in the business world. I learned long ago that the business never says exactly what the business wants, because the business doesn't have the words to describe what it wants accurately enough for IT. Not only do technological barriers exist, but there are also linguistic barriers between the two worlds. So how do I cope?

    The adage "a picture is worth a thousand words" is particularly helpful when I'm called in to help design a new business intelligence solution. Many of my students in BI classes have heard me explain ("rant") about left-to-right versus right-to-left design. To understand what I mean about these two design options, let's start with a picture: when we design a business intelligence solution that includes some sort of traditional data warehouse or data mart design, we typically place the data sources on the left, the new solution in the middle, and the users on the right.

    When I've been called in to help course-correct a failing BI project, I often find that IT has taken a left-to-right approach. They look at the data sources, decide how to model the BI solution as a _______ (fill in the blank with data warehouse, data mart, cube, etc.), and then build the new data structures and supporting infrastructure. (Sometimes, they actually do this without ever having talked to the business first.) Then, when they show what they've built to the business, the business says, "That is not what we want." Uh-oh.

    I prefer to take a right-to-left approach, preferably at the beginning of a project. But even if the project starts left-to-right, I'll do my best to swing it around so that we’re back to a right-to-left approach. (When circumstances are beyond my control, I carry on, but it’s a painful project for everyone – not because of me, but because the approach just doesn’t get to what the business wants in the most effective way.) By using a right-to-left approach, I try to understand what it is the business is trying to accomplish. I do this by having them explain reports to me, and explain the decision-making process that relates to these reports. Sometimes I have them explain their business processes to me, or better yet show me their business processes in action, because I need pictures, too. I (unofficially) call this part of the project "getting inside the business's head." This is starting at the right side of the diagram above.

    My next step is to start moving leftward. I do this by preparing some type of prototype. Depending on the nature of the project, this might mean that I simply mock up some data in a relational database and build a prototype report in Reporting Services. If I'm lucky, I might be able to use real data in a relational database. I'll either use a subset of the data in the prototype report by creating a prototype database to hold the sample data, or select data directly from the source. It all depends on how much data there is, how complex the queries are, and how fast I need to get the prototype completed. If the solution will include Analysis Services, then I'll build a prototype cube.

    Analysis Services makes it incredibly easy to prototype. You can sit down with the business, show them the prototype, and have a meaningful conversation about what the BI solution should look like. I know I've done a good job on the prototype when I get knocked out of my chair so that the business user can explore the solution further independently. (That's really happened to me!) We can talk about dimensions, hierarchies, levels, members, measures, and so on with something tangible to look at and without using those terms. It's not helpful to use sample data like Adventure Works or to use BI terms that they don't really understand. But when I show them their data using the BI technology and talk to them in their language, then they truly have a picture worth a thousand words. From that, we can fine-tune the prototype to move it closer to what they want. They have a better idea of what they're getting, and I have a better idea of what to build.

    So right-to-left design is not truly moving from the right to the left. It starts from the right and moves towards the middle, and once I know what the middle needs to look like, I can then build from the left to meet in the middle. And that’s how I get past what the business says to what the business wants.
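    As a concrete illustration of the mock-up step (my hedged sketch, not the post's actual prototype - every name and number below is invented), a throwaway table with a handful of rows is often all it takes to drive a first Reporting Services report:

      -- Hedged sketch: sample data to back a prototype report; all values are invented
      CREATE TABLE dbo.PrototypeSales (
          SalesRegion varchar(20),
          SalesMonth  date,
          Amount      money
      );

      INSERT INTO dbo.PrototypeSales VALUES
          ('East', '2011-01-01', 125000),
          ('East', '2011-02-01', 131000),
          ('West', '2011-01-01',  98000),
          ('West', '2011-02-01', 104500);

      -- The kind of aggregate the prototype report would display
      SELECT SalesRegion, SUM(Amount) AS TotalSales
      FROM dbo.PrototypeSales
      GROUP BY SalesRegion;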

    Read the article

  • SCSF for Visual Studio 2010

    - by Anthony Trudeau
    The Smart Client Software Factory (SCSF) version for Visual Studio 2010 is supposed to be released sometime this week.  The updated (final?) source code is available on the patterns & practices site already, but I'm guessing it could be updated again due to changes found during the final testing. You'll need the Visual Studio 2010 SDK as well as the new versions of the Guidance Automation Extensions (GAX) and the Guidance Automation Toolkit (GAT) for the SCSF. Here are the direct links for those installations:

    - Visual Studio 2010 SDK
    - Guidance Automation Extensions (GAX)
    - Guidance Automation Toolkit (GAT)

    Read the article

  • how to assign web server and domain a public ip address

    - by kdavis8
    I have installed an ISO image of Windows Server 2008 R2 onto my VMware Workstation as a virtual server. I am trying to host my own web server for testing purposes. I have Internet service with Sprint, and I called them to obtain my public IP address. Now that I have my public IP address, how do I assign it to my server? I also have a web domain name that I would like to point at that web server. Do I give it the public IP address, or do I give it the name of the server?
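    For what it's worth, the DNS half of the usual answer can be sketched like this (hedged; the domain and IP below are documentation placeholders, not the asker's real values) - the domain's zone gets an A record pointing at the public IP, while the router/modem forwards port 80 to the server's private address:

      ; Hedged sketch of DNS zone entries - 203.0.113.10 is a placeholder documentation IP
      example.com.      IN  A   203.0.113.10
      www.example.com.  IN  A   203.0.113.10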

    Read the article

  • rich snippets ignored by google [closed]

    - by Thoir Fáidh
    Possible Duplicate: Why would Google Rich Snippets work for one site author but not another? I'm facing a problem here. I made rich snippets - microdata - for the website, but Google ignores all of them. Here is how it looks in the testing tool. It doesn't detect any errors. I've read that Google ignores microdata in hidden fields. Unfortunately this is partially the case, since I use jQuery to interact with the content, but nevertheless it is not hidden everywhere, and I believe that Google should recognize at least the microdata permanently visible to the user. Am I missing something here? It is now about 3 weeks since I updated the website with rich snippets.
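    A hedged illustration of the visibility point (not the poster's actual markup; the type and values are invented): itemprop attributes should sit on elements whose text is actually rendered to the user, not on fields jQuery keeps hidden:

      <!-- Hedged sketch: microdata wrapping visible text -->
      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Example Widget</span>
        <span itemprop="description">Text the user can actually see on the page.</span>
      </div>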

    Read the article

  • creating a google wave clone using php/mysql/jquery

    - by jeansymolanza
    Season's greetings to all. I have a question that has been rather bugging me as of late. Does anyone know how one can create a Google Wave clone using PHP/MySQL/jQuery as the primary points of development? Any ideas on how this might be possible, and can you recommend any starting points? I have some time off work and it would be an interesting project to undertake, as I want to use it in an e-learning framework next year. I will be testing the product on a XAMPP local server. I understand some of the technologies that Google Wave uses, but I am rather curious as to how these can be developed to a decent standard using PHP/MySQL/jQuery (I mention these three because I am quite adept at them). Any links to resources best suited to an intermediate programmer would be appreciated. Many thanks and God bless. So far I have this: http://konrness.com/javascript/google-wave-style-scroll-bar-jquery-plugin/
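    Not an answer from the thread, but a hedged sketch of the simplest building block for the jQuery side - a polling loop against a hypothetical PHP endpoint (updates.php, the JSON shape, and the #wave element are all invented for illustration):

      // Hedged sketch: poll a PHP endpoint for new messages every 2 seconds
      function poll(lastId) {
        $.getJSON('updates.php', { since: lastId }, function (data) {
          $.each(data.messages, function (i, msg) {
            $('#wave').append($('<p/>').text(msg.text)); // render each new message
            lastId = msg.id;
          });
          setTimeout(function () { poll(lastId); }, 2000); // schedule the next poll
        });
      }
      poll(0);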

    Read the article

  • Running a Mongo Replica Set on Azure VM Roles

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2013/10/15/running-a-mongo-replica-set-on-azure-vm-roles.aspx

    Setting up a MongoDB Replica Set with a bunch of Azure VMs is straightforward stuff. Here’s a step-by-step which gets you from 0 to fully-redundant 3-node document database in about 30 minutes (most of which will be spent waiting for VMs to fire up). First, create yourself 3 VM roles, which is the minimum number of nodes you need for high availability. You can use any OS that Mongo supports. This guide uses Windows, but the only difference will be the mechanism for starting the Mongo service when the VM starts (Windows Service, daemon etc.). While the VMs are provisioning, download and install Mongo locally, so you can set up the replica set with the Mongo shell. We’ll create our replica set from scratch, doing one machine at a time (if you have a single node you want to upgrade to a replica set, it’s the same from step 3 onwards):

    1. Setup Mongo. Log into the first node, download Mongo and unzip it to C:. Rename the folder to remove the version - so you have c:\MongoDB\bin etc. - and create a new folder for the logs, c:\MongoDB\logs.

    2. Setup your data disk. When you initialize a node in a replica set, Mongo pre-allocates a whole chunk of storage to use for data replication. It will use up to 5% of your data disk, so if you use a Windows VM image with a default 120Gb disk and host your data on C:, then Mongo will allocate 6Gb for replication. And that takes a while. Instead you can create yourself a new partition by shrinking down the C: drive in Computer Management, by say 10Gb, and then creating a new logical disk for your data from that spare 10Gb, which will be allocated as E:. Create a new folder, e:\data.

    3. Start Mongo. When that’s done, start a command line, point to the mongo binaries folder, install Mongo as a Windows Service, running in replica set mode, and start the service:

      cd c:\mongodb\bin
      mongod -logpath c:\mongodb\logs\mongod.log -dbpath e:\data -replSet TheReplicaSet --install
      net start mongodb

    4. Open the ports. Mongo uses port 27017 by default, so you need to allow access in the machine and in Azure. In the VM, open Windows Firewall and create a new inbound rule to allow access via port 27017. Then in the Azure Management Console for the VM role, under the Configure tab, add a new rule, again to allow port 27017.

    5. Initialise the replica set. Start up your local mongo shell, connecting to your Azure VM, and initiate the replica set:

      c:\mongodb\bin\mongo sc-xyz-db1.cloudapp.net
      rs.initiate()

    This is the bit where the new node (at this point the only node) allocates its replication files, so if your data disk is large, this can take a long time (if you’re using the default C: drive with 120Gb, it may take so long that rs.initiate() never responds; if you’re sat waiting more than 20 minutes, start another instance of the mongo shell pointing to the same machine to check on it). Run rs.conf() and you should see one node configured.

    6. Fix the host name for the primary - *don’t miss this one*. For the first node in the replica set, Mongo on Windows doesn’t populate the full machine name. Run rs.conf() and the name of the primary is sc-xyz-db1, which isn’t accessible to the outside world. The replica set configuration needs the full DNS name of every node, so you need to manually rename it in your shell, which you can do like this:

      cfg = rs.conf()
      cfg.members[0].host = 'sc-xyz-db1.cloudapp.net:27017'
      rs.reconfig(cfg)

    When that returns, rs.conf() will have your full DNS name for the primary, and the other nodes will be able to connect. At this point you have a working database, so you can start adding documents, but there’s no replication yet.

    7. Add more nodes. For the next two VMs, follow steps 1 through 4, which will give you a working Mongo database on each node, which you can add to the replica set from the shell with rs.add(), using the full DNS name of the new node and the port you’re using:

      rs.add('sc-xyz-db2.cloudapp.net:27017')

    Run rs.status() and you’ll see your new node in STARTUP2 state, which means it’s initializing and replicating from the PRIMARY. Repeat for your third node:

      rs.add('sc-xyz-db3.cloudapp.net:27017')

    When all nodes are finished initializing, you will have a PRIMARY and two SECONDARY nodes showing in rs.status(). Now you have high availability, so you can happily stop db1, and one of the other nodes will become the PRIMARY with no loss of data or service.

    Note - the process for AWS EC2 is exactly the same, but with one important difference. On the Azure Windows Server 2012 base image, the MongoDB release for 64-bit 2008R2+ works fine, but on the base 2012 AMI that release keeps failing with a UAC permission error. The standard 64-bit release is fine, but it lacks some optimizations that are in the 2008R2+ version.

    Read the article

  • Thoughts on Development using Virtual Machines

    - by J_A_X
    I'll be working as a development lead for a startup and I've suggested that we use VMs for development. I'm not talking about each developer having a desktop with VMs for testing/development; I mean having a server rack where all VMs are managed, and having the developers work from a micro PC (ChromeOS, anyone?) locally, or even remotely from their home computer. To me, the benefits are that it's extremely scalable, cheaper in the long run, easier to manage, and that we utilize the hardware to its maximum potential. As for cons, I can't think of any particular showstoppers other than that we'll need someone to set up and maintain said setup. I was hoping that some of you might have had a similar setup at your place of employment and would be able to weigh in with your opinions. Thanks.

    Read the article

  • Codestock: Apparently Powershell ain't got the power

    - by Theo Moore
    I checked on the status of voting on the Codestock (www.codestock.org) site this week. I was surprised to see that none of the Powershell sessions were among the leaders in voting. Now, I confess that I am somewhat biased (my session is on Powershell), but that said, I thought it odd. I was under the impression that Powershell had a strong following and that many people were using it. I suppose the voting reflects a stronger developer community that might not make use of Powershell to the degree some others might. I am a huge fan of Powershell and I am constantly impressed with the things it can do. In my case, I use it as a lightweight functional testing harness for web pages. I use it in this capacity at work and for work I do for the Carbonated Comics (www.carbonatedcomics.com) site as well. If anyone still hasn't registered, do us a favor and vote for a Powershell session, K?
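    As a hedged sketch of what such a harness can look like (not Theo's actual script; the URL is taken from the post, but the match string is a placeholder):

      # Hedged sketch: fetch a page and assert on its content - a one-line smoke test
      $html = (New-Object System.Net.WebClient).DownloadString('http://www.carbonatedcomics.com/')
      if ($html -match 'Carbonated') { 'PASS: page loaded with expected text' }
      else { 'FAIL: expected text not found' }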

    Read the article

  • How can you become a competent web application security expert without breaking the law?

    - by hal10001
    I find this to be equivalent to undercover police officers who join a gang, do drugs and break the law as a last resort in order to enforce it. To be a competent security expert, I feel hacking has to be a constant hands-on effort. Yet, that requires finding exploits, testing them on live applications, and being able to demonstrate those exploits with confidence. For those that consider themselves "experts" in Web application security, what did you do to learn the art without actually breaking the law? Or, is this the gray area that nobody likes to talk about because you have to bend the law to its limits?

    Read the article

  • Rich Snippets - LocalBusiness - Photos - Correct Implementation

    - by user32622
    Does somebody know how this is supposed to be implemented correctly? In my local business full page, I have a carousel with several images, so what I did is write the following on the container of this carousel: itemprop='photos' itemscope itemtype="http://schema.org/ImageObject", i.e.

    <div class="tourism-product-media-gallery" itemprop='photos' itemscope itemtype="http://schema.org/ImageObject">

    and then on each and every image I have written the following: itemprop="contentURL", i.e.

    <img src="@mediaItem.NormalImage" alt="@mediaItemCaption" itemprop="contentURL"/>

    But I am not convinced that this is the way it should be. Does anyone have any insight on this and more knowledge? Thanks. Note: here are the results from the rich snippet Google testing tool: click here
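    One hedged reading of the schema.org vocabulary (worth verifying against the current spec, since this is my sketch rather than a confirmed answer): LocalBusiness takes a photo property, each image becomes its own ImageObject, and the image URL property is spelled contentUrl:

      <!-- Hedged sketch: one ImageObject per image, nested under the LocalBusiness -->
      <div itemscope itemtype="http://schema.org/LocalBusiness">
        <div itemprop="photo" itemscope itemtype="http://schema.org/ImageObject">
          <img itemprop="contentUrl" src="@mediaItem.NormalImage" alt="@mediaItemCaption"/>
        </div>
      </div>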

    Read the article

  • Back from Istanbul - Presentations available for download

    - by Javier Puerta
    (Photo by Paul Thompson, 14-March-2012) On March 14-15th we celebrated our 2012 Manageability Partner Community EMEA Forum in Istanbul, Turkey. It was an intense two days, packed with great content and a lot of networking. Organizing it jointly with the Exadata Partner Forum allowed participants to benefit also from the content of the Exadata sessions, which is a key topic as an infrastructure building block as we move to cloud architectures. During the sessions we listened to two thought-leaders in our industry, Ron Tolido, from Capgemini, and Julian Dontcheff, from Accenture. We thank our Manageability partner Capgemini/Sogeti for sharing with the community their experiences in developing their Testing offering based on Oracle products. The slide decks used in the presentations are now available for download at the Manageability Partner Community Collaborative Workspace (for community members only - if you get an error message, please register for the Community first). I want to thank all who participated at the event, and look forward to meeting again at next year's Forum.

    Read the article

  • How to deal with ad-hoc mindsets?

    - by Rotian
    I joined a dev team of six two months ago. People are nice, all is good. But more and more I observe an ad-hoc mindset. Stuff gets quick-fixed at the cost of future usability, there is little testing, and two people happily admitted that they like to carry the knowledge around in their heads rather than write it down. How do I deal with this? I'd like to lead by example, but time is limited - I like architecting and actually implementing the stuff. But I'm afraid the ad-hoc mindset will infect me, and rather than striving for clearness and simplicity in design and code - which isn't simple to establish - I'll get pulled down the drain of an endless spiral of hacks on hacks - which no outsider can uncouple - just for schedule's and management's sake.

    Read the article

  • Saving and Loading the Game (Automatically or Manually) via Internal Storage Only (Tablet PC Issues)

    - by David Dimalanta
    Here is my question. When making a game app for Android, I considered the device first. It's no problem to save everything (from levels to records) on a smartphone because it has an SD card slot. The exception is the tablet PC, which has nothing but internal storage. For example, I'm using this tutorial for audio spectrum (see http://www.youtube.com/watch?v=5cN1VzZXcdo) that involves copying from internal to external storage in order to detect frequency. It works on the desktop but not on Android tablets (i.e. Google Nexus tablet). Is there a way to work around save/load problems caused by these internal/external storage differences? Additionally, why does the audio spectrum code work on the desktop but not on tablets? And is it the same issue with saving/loading the game?
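    Assuming the game is built on libGDX (the audio-spectrum tutorial linked above is libGDX-based; this is my hedged sketch, not a confirmed answer), one save path that never touches the SD card is the Preferences API, which persists to the app's internal storage on Android. The file name and keys below are invented:

      import com.badlogic.gdx.Gdx;
      import com.badlogic.gdx.Preferences;

      // Hedged sketch: inside your game's save/load routines
      Preferences prefs = Gdx.app.getPreferences("game-save"); // invented file name
      prefs.putInteger("level", 7);
      prefs.putLong("highScore", 123456L);
      prefs.flush(); // nothing is persisted until flush() is called

      int level = prefs.getInteger("level", 1); // returns 1 if no save exists yet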

    Read the article

  • A tour of the GlassFish 3.1.2 DCOM support

    - by alexismp
    While we've mentioned the DCOM support in GlassFish 3.1.2 several times before, you'll probably find Byron's DCOM blog entry to be useful if you're using Windows as a deployment platform for your GlassFish cluster. Byron discusses how DCOM is used to communicate with remote Windows nodes participating in a GlassFish cluster, what Java libraries were used to wrap around DCOM, what new asadmin commands were added (in particular validate-dcom), as well as some tips to make this all work in your specific environment. In addition to this blog post, you should consider reading the official product documentation: • Considerations for Using DCOM for Centralized Administration • Setting Up DCOM and Testing the DCOM Set Up
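    From the material linked above, validate-dcom is the command to try first when a remote Windows node won't respond. A hedged sketch of its use follows - the host name and user are placeholders, and the exact flags should be checked against the Setting Up DCOM guide:

      # Hedged sketch: verify the DAS can reach a Windows node over DCOM
      asadmin validate-dcom --windowsuser Administrator winnode1.example.com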

    Read the article

  • Can't remove the libpcap0.8 package

    - by Yogesh
    I am getting an error when running apt-get remove:

    root@System:~/Downloads# sudo apt-get remove
    The following packages have unmet dependencies:
     libpcap0.8 : Breaks: libpcap0.8:i386 (!= 1.4.0-2) but 1.5.3-2 is installed
     libpcap0.8:i386 : Breaks: libpcap0.8 (!= 1.5.3-2) but 1.4.0-2 is installed
     libpcap0.8-dev : Depends: libpcap0.8 (= 1.5.3-2) but 1.4.0-2 is installed
    E: Unmet dependencies. Try using -f.

    and when I ran apt-get remove -f this is what happens:

    root@System:~/Downloads# sudo apt-get remove -f
    The following extra packages will be installed: libpcap0.8
    The following packages will be upgraded: libpcap0.8
    1 upgraded, 0 newly installed, 0 to remove and 365 not upgraded.
    2 not fully installed or removed.
    Need to get 0 B/110 kB of archives.
    After this operation, 13.3 kB of additional disk space will be used.
    Do you want to continue? [Y/n] y
    (Reading database ... 163539 files and directories currently installed.)
    Preparing to unpack .../libpcap0.8_1.5.3-2_amd64.deb ...
    Unpacking libpcap0.8:amd64 (1.5.3-2) over (1.4.0-2) ...
    dpkg: error processing archive /var/cache/apt/archives/libpcap0.8_1.5.3-2_amd64.deb (--unpack):
     trying to overwrite shared '/usr/share/man/man7/pcap-filter.7.gz', which is different from other instances of package libpcap0.8:amd64
    dpkg-deb: error: subprocess paste was killed by signal (Broken pipe)
    Processing triggers for man-db (2.6.7.1-1) ...
    Errors were encountered while processing: /var/cache/apt/archives/libpcap0.8_1.5.3-2_amd64.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)

    Running sudo apt-get remove -f a second time fails with exactly the same output. apt-get check reports:

    root@System:~/Downloads# sudo apt-get check
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    You might want to run 'apt-get -f install' to correct these.
    The following packages have unmet dependencies:
     libpcap0.8 : Breaks: libpcap0.8:i386 (!= 1.4.0-2) but 1.5.3-2 is installed
     libpcap0.8:i386 : Breaks: libpcap0.8 (!= 1.5.3-2) but 1.4.0-2 is installed
     libpcap0.8-dev : Depends: libpcap0.8 (= 1.5.3-2) but 1.4.0-2 is installed
    E: Unmet dependencies. Try using -f.

    root@System:~/Downloads# apt-cache policy libpcap0.8:amd64 libpcap0.8 libpcap0.8-dev
    libpcap0.8:
      Installed: 1.4.0-2
      Candidate: 1.5.3-2
      Version table:
         1.5.3-2 0
            500 http://in.archive.ubuntu.com/ubuntu/ trusty/main amd64 Packages
     *** 1.4.0-2 0
            100 /var/lib/dpkg/status
    libpcap0.8:
      Installed: 1.4.0-2
      Candidate: 1.5.3-2
      Version table:
         1.5.3-2 0
            500 http://in.archive.ubuntu.com/ubuntu/ trusty/main amd64 Packages
     *** 1.4.0-2 0
            100 /var/lib/dpkg/status
    libpcap0.8-dev:
      Installed: 1.5.3-2
      Candidate: 1.5.3-2
      Version table:
     *** 1.5.3-2 0
            500 http://in.archive.ubuntu.com/ubuntu/ trusty/main amd64 Packages
            100 /var/lib/dpkg/status

    root@System:~/Downloads# sudo apt-get -f remove libpcap0.8 libpcap0.8-dev libpcap0.8-dev:i386 libpcap0.8:i386
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Package 'libpcap0.8-dev:i386' is not installed, so not removed. Did you mean 'libpcap0.8-dev'?
    You might want to run 'apt-get -f install' to correct these:
    The following packages have unmet dependencies:
     libpcap-dev : Depends: libpcap0.8-dev but it is not going to be installed
    E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

    root@System:~/Downloads# sudo apt-get -f install
    [output identical to the apt-get remove -f run above, ending in the same 'trying to overwrite shared /usr/share/man/man7/pcap-filter.7.gz' error]
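    For this particular symptom - dpkg refusing to overwrite a shared file still owned by the other architecture's copy of the package - one common way out (a hedged sketch, not guaranteed on every system) is to let dpkg overwrite the single contested man page and then let apt finish the upgrade:

      # Hedged sketch: force-overwrite the one conflicting file, then repair
      sudo dpkg -i --force-overwrite /var/cache/apt/archives/libpcap0.8_1.5.3-2_amd64.deb
      sudo apt-get -f install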

    Read the article

  • Upgrading to Gnome 3.4 breaks Unity and gnome-shell

    - by mac
    I have upgraded my GNOME Shell to 3.4 in Ubuntu 11.10 through:

    sudo add-apt-repository ppa:ricotz/testing
    sudo add-apt-repository ppa:gnome3-team/gnome3
    sudo apt-get update && sudo apt-get dist-upgrade
    sudo apt-get install gnome-shell

    But it broke my system. GNOME Shell is completely broken - when I log in it just shows the desktop wallpaper and nothing else. And importantly, Unity is also broken. Attaching the screenshot. Some main issues: 1) Two menus are appearing now - the global menu as well as the application menu. 2) Icons on the top-right panel are appearing weirdly. 3) My default Ambiance theme also got screwed up; instead of black menus, I am seeing white menus. How do I fix them? Or do I have an option to revert back to the original settings, or will reinstalling Unity/GNOME Shell help?
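    Since the breakage came from third-party PPAs, the usual rollback (a hedged sketch, assuming the older packages are still published in the Ubuntu 11.10 archive) is ppa-purge, which downgrades everything from a PPA back to the stock archive versions:

      # Hedged sketch: downgrade both PPAs' packages back to the Ubuntu versions
      sudo apt-get install ppa-purge
      sudo ppa-purge ppa:gnome3-team/gnome3
      sudo ppa-purge ppa:ricotz/testing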

    Read the article

  • Ubuntu and VirtualBox

    - by Sinan
    I have the following configuration: a host running Windows 7, and a guest running Ubuntu 14.04 LTS (VirtualBox). I am connecting a Cisco router directly to my PC running Windows 7 and testing the router for NetFlow packets in the VirtualBox guest. I am having difficulty capturing the NetFlow traffic from the Cisco device in my VirtualBox guest using port 2222. I tried the different networking modes provided by VirtualBox (i.e. NAT, Bridged Adapter, Host-only Adapter) but I have not been successful in capturing the NetFlow traffic. Could you please advise me on the configuration that needs to be done on VirtualBox to allow capturing the traffic coming from the router? I can successfully capture the NetFlow traffic on my PC (Windows 7). Thank you.
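    If the guest stays in NAT mode, one hedged sketch of a fix (the VM name is a placeholder, and the VM must be powered off when running modifyvm) is to forward the NetFlow UDP port from the host into the guest:

      # Hedged sketch: forward UDP 2222 from the Windows 7 host into the NAT'd guest
      # Rule format: name,protocol,host-ip,host-port,guest-ip,guest-port
      VBoxManage modifyvm "Ubuntu1404" --natpf1 "netflow,udp,,2222,,2222"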

    Read the article

  • SQLAuthority News – Great Time Spent at Great Indian Developers Summit 2014

    - by Pinal Dave
    The Great Indian Developer Summit (GIDS) is one of the most popular annual events held in Bangalore. This year GIDS was scheduled for April 22-25. I presented a total of four sessions at this event, and each session was very different from the others. Here are the details of the four sessions I presented there.

    (Photo: Pluralsight Shades)

    This event was great, and I had fantastic fun presenting technology here. I was indeed very excited that, along with me, many of my friends were presenting at the event as well. I want to thank all of you for attending my sessions and giving me standing room only every single time. I have already sent resources in my newsletter. You can sign up for the newsletter over here.

    (Photo: Indexing is an Art)

    I was amazed by the crowd present in the sessions at GIDS. There was great interest in the subject of SQL Server and performance tuning.

    (Photo: Audience at GIDS)

    I believe events like this provide a great platform to meet and share knowledge.

    (Photo: Pinal at Pluralsight Booth)

    Here are the abstracts of the sessions I presented. They were recorded, so at some point in time they will be available, but if you want the content of all the courses immediately, I suggest you check out my video courses on the same subjects on Pluralsight.

    Indexes, the Unsung Hero (Relevant Pluralsight Course)

    Slow running queries are the most common problem that developers face while working with SQL Server. While it is easy to blame SQL Server for unsatisfactory performance, the issue often lies with the way queries have been written and how indexes have been set up. The session will focus on ways of identifying problems that slow down SQL Server, and indexing tricks to fix them. Developers will walk out with scripts and knowledge that can be applied to their servers immediately post the session. Indexes are the most crucial objects of the database. They are the first stop for any DBA and developer when it comes to performance tuning. There is a good side as well as an evil side to indexes. To master the art of performance tuning one has to understand the fundamentals of indexes and the best practices associated with them. We will cover various aspects of indexing such as duplicate indexes, redundant indexes, and missing indexes, as well as best practices around indexes.

    SQL Server Performance Troubleshooting: Ancient Problems and Modern Solutions (Relevant Pluralsight Course)

    Many believe performance tuning and troubleshooting is an art which has been lost in time. However, the truth is that the art has evolved with time, and there are more tools and techniques to overcome ancient troublesome scenarios. There are three major resources that, when bottlenecked, create performance problems: CPU, IO, and memory. In this session we will focus on detecting high CPU scenarios and their resolutions. If time permits we will cover other performance related tips and tricks. At the end of this session, attendees will have a clear idea as well as action items regarding what to do when facing any of the above resource intensive scenarios. Developers will walk out with scripts and knowledge that can be applied to their servers immediately post the session. To master the art of performance tuning one has to understand the fundamentals of performance, tuning and the best practices associated with the same. We will discuss performance tuning in this session with the help of demos.

    (Photo: Pinal Dave at GIDS)

    MySQL Performance Tuning – Unexplored Territory (Relevant Pluralsight Course)

    Performance is one of the most essential aspects of any application. Everyone wants their server to perform optimally and at the best efficiency. However, not many people talk about MySQL and performance tuning, as it is an extremely unexplored territory. In this session, we will talk about how we can tune MySQL performance. We will also try and cover other performance related tips and tricks. At the end of this session, attendees will not only have a clear idea, but also carry home action items regarding what to do when facing any of the above resource intensive scenarios. Developers will walk out with scripts and knowledge that can be applied to their servers immediately post the session. To master the art of performance tuning one has to understand the fundamentals of performance, tuning and the best practices associated with the same. You will also witness some impressive performance tuning demos in this session.

    Hidden Secrets and Gems of SQL Server We Bet You Never Knew (Relevant Pluralsight Course)

    SQL Trio Session! It really amazes us every time someone says SQL Server is an easy tool to handle and work with. Microsoft has done amazing work in making working with a complex relational database a breeze for developers and administrators alike. Though it looks like child's play for some, the realities are far from this notion. The basics and fundamentals, though, are simple and uniform across databases; the behavior and the nuts and bolts of SQL Server are something we need to master over a period of time. With a collective experience of more than 30 years on databases amongst the speakers, we will try to take a unique tour of various aspects of SQL Server and bring to you life lessons learnt from working with SQL Server. We will share some of the trade secrets of performance, configuration, new features, tuning, behaviors, T-SQL practices, common pitfalls, and productivity tips on tools and more. This is a highly demo-filled session for practical use if you are a SQL Server developer or an administrator. The speakers will be able to stump you and give you answers on almost everything inside the relational database called SQL Server.

    I personally attended the sessions of Vinod Kumar, Balmukund Lakhani, Abhishek Kumar and my favorite, Govind Kanshi.

    Summary: If you missed this event, here are two action items: 1) Sign up for the Resource Newsletter; 2) Watch my video courses on Pluralsight.

    Reference: Pinal Dave (http://blog.sqlauthority.com)
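    As a footnote to the indexing abstract above (a hedged sketch, not one of the session's actual scripts), the missing-index DMVs are a common starting point for the "missing index" discussion:

      -- Hedged sketch: surface missing-index suggestions recorded by SQL Server
      SELECT TOP (10)
          mid.statement AS table_name,
          mid.equality_columns,
          mid.inequality_columns,
          mid.included_columns,
          migs.user_seeks,
          migs.avg_user_impact
      FROM sys.dm_db_missing_index_details AS mid
      JOIN sys.dm_db_missing_index_groups AS mig
          ON mig.index_handle = mid.index_handle
      JOIN sys.dm_db_missing_index_group_stats AS migs
          ON migs.group_handle = mig.index_group_handle
      ORDER BY migs.user_seeks * migs.avg_user_impact DESC;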

    Read the article

  • Cowboy Agile?

    - by Robert May
    In a previous post, I outlined the rules of Scrum.  This post details one of those rules. I’ve often heard similar phrases around Scrum that clue me in to someone who doesn’t understand Scrum.  The phrases go something like this: “We don’t do Agile because the idea of letting people just do whatever they want is wrong.  We believe in a more structured approach.” (i.e. Work is Prison, and I’m the Warden!) “I love Agile.  Agile lets us do whatever we want!” (Cowboy Agile?) “We’re Agile, but we use a process that I’ve created.” (Cowboy Agile?) All of those phrases have one thing in common: the assumption that Agile, and I mean Scrum, lets you do whatever you want.  This is simply not true. Executing Scrum properly requires more dedication, rigor, and diligence than happens in most traditional development methods.

    Scrum and Waterfall Compared

    Since Scrum and Waterfall are two of the most commonly used methodologies, a little bit of contrasting and comparing is in order.

    Waterfall: A project manager defines all tasks and then manages the tasks that team members are working on.
    Scrum: The team members define the tasks and estimates of the stories for the current iteration.  Any team member may work on any task in the iteration.

    Waterfall: Usually only a few milestones need to be met; the milestones are measured in months, and these milestones are expected to be missed.  Little work is ever done to improve estimates, and poor estimators can hide behind high estimates.
    Scrum: Stories must be delivered every iteration, milestones are measured in hours, and the team is expected to figure out why their estimates were wrong, even when they were under.  Repeated misses can get the entire team fired.

    Waterfall: Partially completed work is normal.
    Scrum: Partially completed work doesn’t count.

    Waterfall: Nobody knows the task you’re working on.
    Scrum: Everyone knows what you’re working on, whether or not you’re making progress, and how much longer you think it’s going to take, in hours.

    Waterfall: Little requirement to show working code.  Prototypes are ok.
    Scrum: Working code must be shown each iteration.  No smoke and mirrors allowed.

    Waterfall: Testing is done in lengthy cycles at the end of development.  Developers aren’t held accountable.
    Scrum: Testing is part of the team.  If the testers don’t accept the story as complete, the team can’t count it.  Complete means that the story’s functionality works as designed.  The team can’t have any open defects on the story.

    Waterfall: Velocity is rarely truly measured and difficult to evaluate.
    Scrum: Velocity is integral to the process, can be seen at a glance, and everyone in the company knows what it is.

    Waterfall: A business analyst writes requirements.  Designers mock up screens.  Developers hide behind “I did it just like the spec doc told me to and made the screen exactly like the picture.”
    Scrum: Developers are expected to collaborate in real time.  If a design is bad or lacks needed details, the developers are required to get it right in the iteration, because all software must be functional.  Designers and Business Analysts are part of the team and must do their work in iterations slightly ahead of the developers.

    Waterfall: Upper management is often surprised.  “You told me things were going well two months ago!”
    Scrum: Management receives updates at the end of every iteration showing them exactly what the team did and how that compares to what is remaining in the backlog.  Managers know every iteration what their money is buying.

    Waterfall: Status meetings are rare or don’t occur.  Email is a primary form of communication.
    Scrum: Teams coordinate every single day with each other and use other high-bandwidth communication channels to make sure they’re making progress.  Email is used only as a last resort.  Instead, team members stand up, walk to each other, and talk, face to face.  If that’s not possible, they pick up the phone.

    Waterfall: If someone asks what happened, it’s at the end of a lengthy development cycle measured in months, and nobody really knows why it happened.
    Scrum: Someone asks what happened every iteration.  The team talks about what happened, and then adapts to make sure that what happened either never happens again or happens every time.

    That’s probably enough for now.  As you can see, a lot is required of Scrum teams! One of the key differences in Scrum is that the burden for many activities is shifted to a group of people who share responsibility, instead of a single person having responsibility.  This is a very good thing, since small groups usually come up with better and more insightful work than single individuals.  This shift also results in better velocity.  Team members can take vacations and the rest of the team simply picks up the slack.  With Waterfall, if a key team member takes a vacation, delays can ensue. Scrum requires much more out of every team member and as a result, Scrum teams outperform non-Scrum teams working 60-hour weeks.

    Recommended Reading

    Everyone considering Scrum should read Mike Cohn’s excellent book, User Stories Applied.

    Read the article

  • ADF Sessions at RMOUG this week

    - by shay.shmeltzer
    If you are attending the RMOUG conference this week, you might be interested in checking out some of the sessions we are doing about Oracle ADF:

    Lynn is delivering:
    - The Fusion Development Platform - Wed at 9:00 (404)
    - Put Your Good Taste Into Action: How to Skin ADF Faces Rich Client Applications - Wed at 5:00 (4 c/d)

    Shay is delivering:
    - From SQL to Rich Web Data Visualization - The Fast Route - Thu at 9:00 (404)
    - Adding Mobile and Web 2.0 UIs to Existing Applications - The Fusion Way - Thu at 10:15 (404)

    There are also lots of ADF related sessions delivered by customers and partners, including:
    - Drinking the Kool-Aid - My Journey to Becoming an ADF Believer
    - Case Study: Performance Tuning New ADF Applications Using Oracle Application Testing Suite (ATS)
    - Oracle ADF & JDeveloper: Coming of Age
    - Hello Worldwide Web: Your First JSF in JDeveloper

    For more details, see the schedule here. If you are using ADF already, please drop by and let us know what you think. We are always looking for user feedback.

    Read the article

  • Interfaces Reference Model available

    - by ACShorten
    With an implementation of Oracle Utilities Application Framework based products, you can implement other Oracle technologies to augment your solution. A whitepaper is now available that outlines all the technology integrations possible with various versions of the Oracle Utilities Application Framework. The whitepaper outlines the possible integrations and implementations of other Oracle technologies to address customer requirements in association with Oracle Utilities Application Framework based products. The whitepaper covers a vast range of products, including:

    - Oracle Fusion Middleware
    - Oracle SOA Suite
    - Oracle Identity Management Suite
    - Oracle ExaData and Oracle ExaLogic
    - Oracle VM
    - Data options including Real Application Clustering, Real Application Testing, Data Guard/Active Data Guard, Compression, Partitioning, Database Vault, Audit Vault, etc.

    The whitepaper contains a summary of the integration solution possibilities and links to further information, including product-specific interfaces. The whitepaper is available from My Oracle Support at KB Id: 1506855.1.

    Read the article
