Search Results

Search found 32454 results on 1299 pages for 'google webmaster tools'.


  • With Google Maps, can you use Google's own popup windows?

    - by SLC
    I've implemented a Google map with points based on an address that the user inputs. When you click a point, the popup bubble appears with the name and address in it. Often this name and address belong to a prominent location, such as a university, since it's used for meetings and the like. If you search the address yourself on maps.google.co.uk, you get Google's own popup bubble, which often has a photo, information, opening hours, links to directions, reviews, and so on. I am wondering if there's a way to use that popup dialog instead of my own, where it is available. I can't see anything in the API to do this. I'm using V2 because a lot of our users are on IE6, but I've been told recently that I can upgrade to V3 should I need functionality from it. Any ideas?


  • Google Apps for Domains, Multiple Domains

    - by belliez
    I have a primary Google Apps for Domains account which I use for my personal email, calendar, docs, etc., and it is great. I also receive my POP3 company email via Settings > "Get mail from other accounts" in my account. Due to spam I want to make use of Gmail's servers for my company email, and I have two options:
    [1] Add my second domain as a domain alias
    [2] Create a new Apps for Domains account
    If I do [1], do I access (send and receive) my company email as if it were a separate account, or is it merged into my primary domain? I want the two separated. If I perform [2], can I share my contacts and calendar between the two? I also have Act! contact manager, which syncs to my primary domain, and it is getting messy now with personal and work contacts being changed and synced to my Act CM software. I want to try to separate my personal and work contacts (but keep the work ones available in my primary domain). Hope this makes sense! Your suggestions are gratefully accepted. Thank you.


  • I love Google Chrome, but some non-static pages like Piwik render it unresponsive

    - by gogowitsch
    The web-stat software Piwik stops reacting to mouse clicks after 1-2 seconds. The same is true for Google Maps and Producteev (but Gmail and most other pages work like a charm). These pages rely heavily on JavaScript and work without Flash. I can click for a very short period, and then the mouse cursor no longer responds to the UI (it doesn't turn into an I-beam over input fields, though it still moves; if the freeze occurred while the pointer was over an input field, the cursor stays an I-beam) and all clicks on the DOM are ignored by Chrome. No message appears, neither on screen nor in the Console (F12). There is no obstructing div or the like in the DOM (F12). Since I couldn't find any hints on the source of my problems, I suspected my plugins and extensions. Unfortunately, neither deactivating all plugins nor deactivating all extensions solved the problem. Some observations:
    - For the problematic pages, it always happens.
    - No Dropbox running.
    - Several GB of free RAM.
    - The task manager doesn't show any high CPU or memory utilization (the offending tab uses 30 MB and 0-1% CPU).
    - All problematic pages work in other browsers (Chrome, Firefox, IE).
    - The rest of the computer is very responsive.
    - The computers use different security suites (Kaspersky and Avira).
    The effect exists across several (synchronized) Chrome instances on different machines, all running Windows 7. Both the OS and Chrome are updated automatically. Other tabs and the Chrome chrome (the browser's own tabs, menus, and toolbar buttons) still work. I really don't like switching between browsers. Any ideas?


  • Google Chrome not loading web pages correctly unless multiple refreshes

    - by Brandon Wilson
    Web pages in Google Chrome do not load correctly from time to time. I can't reproduce it; it just happens. Sometimes it happens when I load the browser, other times when I am just browsing. Just now I went to five different web sites and 3 out of 5 of them did not load correctly. I have attached a photo of how Super User loaded the first time I opened it; if I refresh it, it loads correctly. Facebook is bad like this. Sometimes Facebook will load correctly, but some of its back-end scripting may not load, so the page may not refresh automatically. Not sure what is going on. I have tried other browsers (Firefox and Internet Explorer) and they seem to be working correctly. Chrome seems to be acting up only on this computer. All my computers are running Windows 8, and I have removed Chrome completely from this computer and re-installed it. I even disabled all extensions and cleared all the caches. I even tried running Chrome without being logged in. Not sure what else to do at this point. An example of superuser.com not loading correctly: when I refresh, the problem goes away until it happens again. Sometimes it takes two or three refreshes for the page to load correctly.


  • Cannot ping Google Public DNS on 8.8.8.8

    - by Tibor
    I have a weird problem on my Windows 7 (x64) computer. I seem unable to ping the Google Public DNS on one of its addresses (while the other works fine). The peculiar thing is that it fails with the "General failure." error message, which usually means there is a problem with the network adapter or basic connectivity, and not with a timeout as one would expect. I checked my routing tables for anomalies and even flushed them, but the problem seems unrelated. All the other hosts I tried ping fine (they either respond or time out). If I try to tracert or connect to the address via a browser (yes, I know it doesn't listen on port 80), it also fails instantly. The reason I need to ping 8.8.8.8 is that I commonly use it as a test of Internet connectivity because the address is easy to remember. The problem occurs no matter where I connect to the Internet (it is a laptop). What could be the cause of this anomaly? Note: I use native IPv6 connectivity.
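    Since the asker only uses 8.8.8.8 as a reachability check and ICMP is the part that is failing, a protocol-level test can sidestep ping entirely. Below is a minimal sketch (not part of the original question) that asks 8.8.8.8 to resolve a name, assuming Python 3 with dnspython 2.x installed; the hostname queried is just an example.

        # Ask Google Public DNS for an A record instead of relying on ICMP echo.
        import dns.exception
        import dns.resolver  # pip install dnspython

        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = ["8.8.8.8"]   # bypass the locally configured resolvers
        resolver.lifetime = 3.0              # overall timeout in seconds

        try:
            answer = resolver.resolve("www.google.com", "A")
            print("8.8.8.8 answered:", ", ".join(rr.to_text() for rr in answer))
        except dns.exception.DNSException as exc:
            print("8.8.8.8 did not answer a DNS query either:", exc)

    If the query succeeds while ping still reports "General failure", the problem is more likely specific to ICMP handling on the machine (firewall or adapter) than to the reachability of 8.8.8.8.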


  • Postfix: errors when using Google Apps for SMTP

    - by Zed Said
    I am using Postfix and need to send mail through the Google Apps SMTP servers. I am getting errors after I thought I had set everything up correctly:

      May 11 09:50:57 zedsaid postfix/error[22214]: 00E009693FB: to=<[email protected]>, relay=none, delay=2466, delays=2462/3.4/0/0.06, dsn=4.7.0, status=deferred (delivery temporarily suspended: SASL authentication failed; cannot authenticate to server smtp.gmail.com[74.125.155.109]: no mechanism available)
      May 11 09:50:57 zedsaid postfix/error[22213]: 0ACB36D1B94: to=<[email protected]>, relay=none, delay=2486, delays=2482/3.4/0/0.06, dsn=4.7.0, status=deferred (delivery temporarily suspended: SASL authentication failed; cannot authenticate to server smtp.gmail.com[74.125.155.109]: no mechanism available)
      May 11 09:50:57 zedsaid postfix/error[22232]: 067379693D3: to=<[email protected]>, relay=none, delay=2421, delays=2417/3.4/0/0.06, dsn=4.7.0, status=deferred (delivery temporarily suspended: SASL authentication failed; cannot authenticate to server smtp.gmail.com[74.125.155.109]: no mechanism available)

    main.cf:

      # Debian specific: Specifying a file name will cause the first
      # line of that file to be used as the name. The Debian default
      # is /etc/mailname.
      #myorigin = /etc/mailname
      smtpd_banner = $myhostname ESMTP $mail_name (Debian/GNU)
      biff = no
      # appending .domain is the MUA's job.
      append_dot_mydomain = no
      # Uncomment the next line to generate "delayed mail" warnings
      #delay_warning_time = 4h
      readme_directory = no
      # TLS parameters
      #smtpd_tls_cert_file=/etc/ssl/certs/ssl-cert-snakeoil.pem
      #smtpd_tls_key_file=/etc/ssl/private/ssl-cert-snakeoil.key
      smtpd_use_tls=yes
      smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
      smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache
      # See /usr/share/doc/postfix/TLS_README.gz in the postfix-doc package for
      # information on enabling SSL in the smtp client.
      myhostname = zedsaid.com
      alias_maps = hash:/etc/aliases
      alias_database = hash:/etc/aliases
      myorigin = /etc/mailname
      mydestination =
      #relayhost =
      mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
      mailbox_command = procmail -a "$EXTENSION"
      mailbox_size_limit = 0
      recipient_delimiter = +
      inet_interfaces = all
      delay_warning_time = 4h
      smtpd_recipient_limit = 16
      # how many error before back off.
      smtpd_soft_error_limit = 3
      # how many max errors before blocking it.
      smtpd_hard_error_limit = 12
      ## Gmail Relay
      relayhost = [smtp.gmail.com]:587
      smtp_use_tls = yes
      smtp_sasl_auth_enable = yes
      smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
      smtp_sasl_security_options = noanonymous
      smtp_sasl_tls_security_options = noanonymous
      smtp_sasl_mechanism_filter = login
      smtp_tls_eccert_file =
      smtp_tls_eckey_file =
      smtp_use_tls = yes
      smtp_enforce_tls = no
      smtp_tls_CAfile = /etc/postfix/cacert.pem
      smtpd_tls_received_header = yes
      tls_random_source = dev:/dev/urandom
      transport_maps = hash:/etc/postfix/transport
      debug_peer_list = smtp.gmail.com
      debug_peer_level = 3

    What am I doing wrong?
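    The "no mechanism available" part of the error usually points at the Postfix SMTP client not finding a usable SASL mechanism (for example, the Cyrus SASL plain/login modules not being installed) rather than at Google rejecting the credentials. One way to separate the two, sketched below with Python's standard smtplib and placeholder credentials (not taken from the question), is to authenticate against smtp.gmail.com:587 directly from the same host:

        import smtplib

        USER = "user@yourdomain.example"      # placeholder for the account stored in /etc/postfix/sasl_passwd
        PASSWORD = "account-or-app-password"  # placeholder

        server = smtplib.SMTP("smtp.gmail.com", 587, timeout=10)
        server.set_debuglevel(1)   # prints the SMTP dialogue, including the AUTH mechanisms offered
        server.ehlo()
        server.starttls()          # Gmail only advertises AUTH after STARTTLS
        server.ehlo()
        try:
            server.login(USER, PASSWORD)   # raises SMTPAuthenticationError if the credentials are bad
            print("Authentication succeeded; the failure is on the Postfix/SASL side.")
        finally:
            server.quit()

    If this logs in cleanly, the usual suspects are the SASL mechanism modules installed on the server and the smtp_sasl_mechanism_filter / smtp_sasl_security_options settings shown above.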


  • Where to put the SPF TXT record?

    - by YellowSquirrel
    I've set up Google Apps for my domain: I've registered the domain with Google by adding the CNAME Google asked for, and I've apparently successfully set up the Google MX mail servers. So far I don't have a dedicated server: I just have a domain at a registrar. Now I want to activate SPF and I'm confused. On the following short web page, http://www.google.com/support/a/bin/answer.py?answer=178723, it is written that I must add a TXT record containing:

      v=spf1 include:_spf.google.com ~all

    Where should I enter this? Should this go in the zone (?) file, like I did for the CNAME and the MX records? So far I have something like this:

      @ 10800 IN A 217.42.42.42
      @ 10800 IN MX 5 ASPMX3.GOOGLEMAIL.COM.
      @ 10800 IN MX 5 ASPMX2.GOOGLEMAIL.COM.
      @ 10800 IN MX 3 ALT2.ASPMX.L.GOOGLE.COM.
      @ 10800 IN MX 3 ALT1.ASPMX.L.GOOGLE.COM.
      @ 10800 IN MX 1 ASPMX.L.GOOGLE.COM.
      google8a70835987f31e34 10800 IN CNAME google.com.

    Does adding the SPF TXT record mean I should literally have something like this:

      @ 10800 IN A 217.42.42.42
      @ 10800 IN MX 5 ASPMX3.GOOGLEMAIL.COM.
      @ 10800 IN MX 5 ASPMX2.GOOGLEMAIL.COM.
      @ 3600 IN TXT "v=spf1 include:_spf.google.com ~all"
      @ 10800 IN MX 3 ALT2.ASPMX.L.GOOGLE.COM.
      @ 10800 IN MX 3 ALT1.ASPMX.L.GOOGLE.COM.
      @ 10800 IN MX 1 ASPMX.L.GOOGLE.COM.
      google8a70835987f31e34 10800 IN CNAME google.com.

    I made that one up and put the TXT record right in the middle to show how confused I am. What I'd like to know is the exact syntax and where/how I should put this TXT record.
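    For what it's worth, an SPF policy is published as a TXT record at the zone apex (the same @ name the MX records use), so the guessed layout above is essentially the right shape; the order of records within the zone does not matter. Once the record has been saved at the registrar and has propagated, a quick lookup confirms it is visible. A minimal sketch, assuming Python 3 with dnspython installed and a placeholder domain name:

        import dns.resolver  # pip install dnspython

        DOMAIN = "example.com"  # placeholder for the real domain

        for rdata in dns.resolver.resolve(DOMAIN, "TXT"):
            txt = b"".join(rdata.strings).decode()   # TXT data can be split into 255-byte chunks
            if txt.lower().startswith("v=spf1"):
                print("SPF record published:", txt)
                break
        else:
            print("No SPF record visible yet (not added, or not propagated).")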


  • How to see the properties of a DOM element as they change in realtime?

    - by allquixotic
    JavaScript code can update the properties/attributes of DOM elements in real time by responding to events and so on. Here is an example. In the table on that page, move your mouse over the cells. Notice how they change color when the mouse is on them, and the color goes away when you move the mouse to another cell. Now, using Firefox or Chrome (but not IE, Opera, etc.), I want to examine the background color, expressed in RGB or hex or whatever, of the cells updated in real time, as the mouse cursor enters and leaves the region and causes the JS to do its thing. The behavior that I observe, currently, is that the Inspect Element functionality of both Firefox and Chrome does not update the value of the properties as they are updated by JavaScript. So, in order to view the latest value of the property, I have to inspect the element again, and it takes a momentary "snapshot" of the values. But since the values only change while I have the mouse on them, I can't take a snapshot of the value I want while my mouse cursor is over the cell, because I have to remove my mouse from the cell to select the "Inspect Element" item in the right-click list! If it is possible to have the values updated in real time using either Firefox or Chrome, or an extension, on any recent version of the software (up to the latest stable), please provide instructions for doing so.


  • Google 2-step verification: Should my phone know my password? [closed]

    - by Sir Code-A-Lot
    Hi, I just enabled 2-step verification for my Google account. I have installed Google Authenticator on my Android phone, and I set up an application-specific password for the Google account associated with my phone. This works great when using installed apps like Gmail, Calendar and Google Reader. But if I want to access Google Docs, Google Tasks or any other website that requires a Google login, I don't seem to be able to use an application-specific password. I have to use my real password and then use Google Authenticator to generate a code for the next step. This means that if my phone is stolen, revoking the password given to my phone is pointless. The phone has already been verified, and all that is needed is my password, which the phone's browser will have remembered. I realize that I can take measures to ensure the phone's browser doesn't remember my password, but that's just not convenient at all. Am I missing something, or is there no elegant solution to this? Should I just let my phone know my real password? As I see it, being able to log in with application-specific passwords on websites (which apparently isn't possible) is the only way I can revoke my phone's access in a meaningful way.


  • Does Google sometimes ignore "special" characters, possibly depending on your location or font type settings? [closed]

    - by RLH
    TLDR: Google tends to ignore special characters in my search strings. Is there anything that I can do about it, and is it possibly happening because Google makes certain assumptions based on my default text-encoding settings and my location? I just posted this question over at StackOverflow. I had found a C preprocessor operator that I'd never seen before. As I should have done, I Googled it and tried to find out further information. I attempted various search terms which were all variations of "C Operator ##" (sometimes with and sometimes without the double quotes). Google didn't bring back anything of use, so I posted my question on SO. As you can see from the comments, someone mentioned a search string (ironically one which I did try to search) and stated that I could have even hit the "I'm feeling lucky" button and have gotten my answer. The problem is I did search that, and the results that I received were far more basic; even after following the top results and searching the resulting pages, I could find nothing referencing the string "##". I'm not posting this question to complain, but it does provide an empirical example of something I've seen before that really bugs me: Google often ignores special characters in my search strings and the results are often useless. As a developer I often need to search for string values containing non-alphanumeric characters. Some characters (like the underscore or hyphen) can be used without trouble. However, other characters (such as the ampersand, caret, tilde and pound sign) are often ignored in my query strings. Is there a way to prevent this from happening so that I can get meaningful results from Google? NOTE: I stay logged into Google and I live in the US. I wonder if Google detects some form of text-encoding setting or derives my results based on certain localized, text-based assumptions. Regardless, I would like Google to search for what I give it. Is there anything that I can do to improve my results?


  • Migrate Active Directory to Google Apps for Business

    - by dewnix
    I've got a problem migrating Active Directory to Google Apps. I'm stuck in Google Apps Directory Sync (GADS), where it just gives the error "java.lang.NullPointerException" after testing the connection during the LDAP configuration step. I checked the logs and have pretty much determined that port 389 (the standard LDAP port) isn't listening on the Exchange server. I've tried telnetting to it (from another machine on the same network) with no luck, but I can successfully telnet to other ports that I know are open. I know they're open because I used portqry and netstat to see them. I suspect that Active Directory isn't even installed or running on this machine, because there are no Active Directory services running on it at all. There are no Active Directory services that say they're NOT running either, though. Is it possible AD is installed somewhere else? Does it have to be on a machine inside the same network? I found the domain controller and its host name, and when I telnet to port 389 there, it works; however, GADS still gives me the same exact error when I substitute that server in. Actually, no matter what ridiculous settings I put into GADS, I still get that same NullPointer error. If I could get a different error than that NullPointer, I'd call that a successful day.
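    As the question already works out, LDAP is normally served by the domain controllers rather than by the Exchange box, so GADS needs to be pointed at a host that actually answers on the directory ports. A small reachability sketch (standard-library Python, placeholder host names, not from the original question) that reproduces the telnet/portqry checks for the usual Active Directory ports:

        import socket

        HOSTS = ["dc01.corp.example", "exchange01.corp.example"]       # placeholder host names
        PORTS = {389: "LDAP", 636: "LDAPS", 3268: "Global Catalog"}    # well-known AD ports

        for host in HOSTS:
            for port, name in PORTS.items():
                try:
                    with socket.create_connection((host, port), timeout=3):
                        print(f"{host}:{port} ({name}) is listening")
                except OSError as exc:
                    print(f"{host}:{port} ({name}) not reachable: {exc}")

    Whichever host answers on 389 (or 3268 for the global catalog) is the one to enter in the GADS LDAP configuration step, together with credentials that are allowed to read the directory.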


  • Error while installing VMware Tools v8.8.2 in Ubuntu 12.04 beta

    - by Dipen Patel
    I just upgraded to Ubuntu 12.04 from 11.10 using Update Manager. I use it as a virtual machine on VMware Player 4.x. As usual, I installed VMware Tools to enable full-screen mode and shared-folder functionality, but during installation I got an error while building the modules for the shared folder and fast networking utilities. The error is:

      /tmp/vmware-root/modules/vmhgfs-only/fsutil.c: In function ‘HgfsChangeFileAttributes’:
      /tmp/vmware-root/modules/vmhgfs-only/fsutil.c:610:4: error: assignment of read-only member ‘i_nlink’
      make[2]: *** [/tmp/vmware-root/modules/vmhgfs-only/fsutil.o] Error 1
      make[2]: *** Waiting for unfinished jobs....
      /tmp/vmware-root/modules/vmhgfs-only/file.c:128:4: warning: initialization from incompatible pointer type [enabled by default]
      /tmp/vmware-root/modules/vmhgfs-only/file.c:128:4: warning: (near initialization for ‘HgfsFileFileOperations.fsync’) [enabled by default]
      /tmp/vmware-root/modules/vmhgfs-only/tcp.c:53:30: error: expected ‘)’ before numeric constant
      /tmp/vmware-root/modules/vmhgfs-only/tcp.c:56:25: error: expected ‘)’ before ‘int’
      /tmp/vmware-root/modules/vmhgfs-only/tcp.c:59:33: error: expected ‘)’ before ‘int’
      make[2]: *** [/tmp/vmware-root/modules/vmhgfs-only/tcp.o] Error 1
      make[1]: *** [_module_/tmp/vmware-root/modules/vmhgfs-only] Error 2
      make[1]: Leaving directory `/usr/src/linux-headers-3.2.0-22-generic'
      make: *** [vmhgfs.ko] Error 2
      make: Leaving directory `/tmp/vmware-root/modules/vmhgfs-only'
      The filesystem driver (vmhgfs module) is used only for the shared folder feature.
      The rest of the software provided by VMware Tools is designed to work independently of this feature.

    Let me know if anyone has encountered and solved this problem. Regards, Dipen Patel


  • make-like build tools for data?

    - by miku
    Make is a standard tool for building software, but make decides whether a target needs to be regenerated by comparing file modification times. Are there any proven, preferably small tools that handle builds not for software but for data? Something that regenerates targets not only based on modification times but on certain other properties (e.g. completeness). (Or alternatively, a paper that describes such a tool.) As an illustration, I'd like to automate the following process:
    - get data (e.g. a tarball) from some regularly updated source
    - copy it somewhere if it's not already there (based e.g. on some filename scheme)
    - convert the files to a different format (but only if there aren't successfully converted ones there already, e.g. from a previous attempt; a custom comparison routine)
    - for each file, find a certain data element and fetch some additional file from, say, a URL, but only if that hasn't been downloaded yet (decide based on the existence of the file and file "freshness")
    - finally compute something (e.g. a word count for something identifiable) and store it in the database, but only if the DB does not have an entry for that exact ID yet
    Observations:
    - there are different stages
    - each stage is usually simple to compute or implement in isolation
    - each stage may be simple, but the data volume may be large
    - each stage may produce a few errors
    - each stage may have different signals for when (re)processing is needed
    Requirements:
    - builds should be interruptible and idempotent (i.e. robust); when interrupted, already processed objects should be reused to speed up the next run
    - data paths should be easy to adjust (simple syntax, nothing new to learn; an internal DSL would be OK)
    - some form of dependency graph that describes the process would be nice for later visualizations
    - should leverage existing programs, if possible
    I've done some research on make alternatives like rake and have worked a lot with ant and maven in the past. All these tools naturally focus on code and software builds, not on data builds. The system we have in place now for a task similar to the above is pretty much just shell scripts, which are compact (and are OK glue for a variety of other programs written in other languages), so I wonder if worse is better?
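    The common thread in the stages above is that each one needs a richer "is this target up to date?" test than make's plain timestamp comparison. That decision is small enough to sketch directly; the following is illustrative Python (not one of the existing tools mentioned) showing a rebuild predicate that combines missing-output, staleness, and a pluggable completeness check:

        from pathlib import Path
        from typing import Callable, Iterable

        def needs_rebuild(target: Path, sources: Iterable[Path],
                          is_complete: Callable[[Path], bool] = lambda p: True) -> bool:
            """Rebuild if the target is missing, incomplete, or older than any source."""
            if not target.exists():
                return True                                     # never built
            if not is_complete(target):
                return True                                     # e.g. truncated download, failed conversion
            newest_source = max(p.stat().st_mtime for p in sources)
            return target.stat().st_mtime < newest_source       # the classic make rule

        def build(target: Path, sources: Iterable[Path], recipe: Callable[[], None],
                  is_complete: Callable[[Path], bool] = lambda p: True) -> None:
            """Run the (idempotent) recipe only when needed, so interrupted runs can resume."""
            if needs_rebuild(target, list(sources), is_complete):
                recipe()
            else:
                print(f"{target} is up to date, skipping")

        # example: redo the word count only if the converted file changed or the output is empty
        # build(Path("counts.txt"), [Path("converted.json")],
        #       recipe=lambda: Path("counts.txt").write_text("42"),
        #       is_complete=lambda p: p.stat().st_size > 0)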


  • Tools of the Trade

    - by Ajarn Mark Caldwell
    I got pretty excited a couple of days ago when my new laptop arrived. "The new phone books are here! The new phone books are here! I'm a somebody!" - Steve Martin in The Jerk

    It is a Dell Precision M4500 with an Intel Core i7 2.8 GHz running 64-bit Windows 7, with a 15.6" widescreen, 8 GB RAM, and a 256 GB SSD. For some of you high fliers, this may be nothing to write home about, but compared to the 32-bit Windows XP laptop with 2 GB of RAM and a regular hard disk that I'm coming from, it's a really nice step forward. I won't even bore you with the details of the desktop PC I was first given when I started here 5 1/2 years ago. Let's just say that things have improved. One really nice thing is that while we are definitely running a lean and mean department in terms of staffing, my boss believes in supporting that lean staff with good tools in order to stay lean instead of having to spend even more money on additional employees. Of course, that only goes so far, and at some point you have to add more people in order to get more work done, which is why we are bringing on board a new employee and a new contract developer next week. But that's a different story for a different time.

    The main topic for this post is to highlight the variety of tools that I use in my job and that you might find useful, too. This is easy to do right now because the process of building up my new laptop from scratch has forced me to assemble a list of software that had to be installed and configured. Keep in mind as you look through this list that I play many roles in our company. My official title is Software Engineering Manager, but in addition to managing the team, I am also an active ASP.NET and SQL developer, the Database Administrator, and 50% of the SAN Administrator team. So, without further ado, here are the tools and some comments about why I use them:

    - Virtual Clone Drive: Easily mount an ISO image as a DVD drive. This is particularly handy when you are downloading disk images from Microsoft for your tools.
    - SQL Server 2008 R2 Developer Edition: We are migrating all of our active systems to SQL 2008 R2. Developer Edition has all the features of Enterprise Edition, but is intended for development use.
    - SQL Server 2005 Developer Edition (BIDS only): The migration to SSRS 2008 R2 is just getting started, and in the meantime, maintenance work still has to be done on the reports on our SQL 2005 server. For some reason, you can't use BIDS from 2008 to write reports for a 2005 server. There is some different format, and when you open 2005 reports in 2008 BIDS, it forces you to upgrade, after which they can no longer be uploaded to a 2005 server. Hopefully Microsoft will fix this soon, in some manner similar to how Visual Studio now allows you to pick which version of the .NET Framework you are coding against.
    - Visual Studio 2010 Premium: All of our application development is in ASP.NET, and we might as well use the tool designed for it. I've used a version of Visual Studio going all the way back to VB 6.0 and Visual InterDev.
    - Vault Professional Client: Several years ago we replaced Visual SourceSafe with SourceGear Vault (then Fortress, and now Vault Pro), and I love it. It is very reliable with low overhead - perfect for a small to medium size development team. And being a small ISV, their support is exceptional.
    - Red Gate Developer Bundle with the SQL Source Control update for Vault: I first used, and fell in love with, SQL Prompt shortly before Red Gate bought it, and then Red Gate's first release made me love it even more. SQL Refactor (which has since been rolled into the latest version of SQL Prompt) has saved me many hours and migraines trying to understand somebody else's code when their indenting was nonexistent, or worse, irrational. SQL Compare has been awesome for troubleshooting potential schema issues between different instances of system databases. SQL Data Compare helped us identify the cause behind a bug which appeared in PROD but could not be reproduced in a nearly (but not quite exactly) identical copy in UAT. And the newest tool we are embracing: SQL Source Control. I blogged about it last December. This is really going to help us keep each developer's copy of the database in sync with the others.
    - Fiddler: Helps you watch the whole traffic stream of web visits. I haven't used it a lot, but it did help me track down some odd 404 errors we were finding in our own application logs. It has some other JavaScript troubleshooting capabilities, but some of its usefulness has been supplanted by the Developer Tools option in IE8.
    - Funduc Search & Replace: Find any string anywhere in a mound of source code really, really fast. Does RegEx searches, if you understand that foreign language. It has really helped with some refactoring work to pinpoint, for example, everywhere a particular stored procedure is referenced, whether in .NET code or other SQL procedures (which we have in script files). Provides in-context preview of the search results. Fantastic tool, and a bargain price.
    - SciTE: SciTE is a Scintilla-based text editor and a fantastic, lightweight tool for quickly reviewing (or writing) program code, SQL scripts, and extract files. It has language-specific syntax highlighting. I used it to write several batch and CMD programs a year ago, and to examine data extract files for exchanging information with other systems. Extremely handy are the options to View End of Line and View Whitespace. Ever receive a file that is supposed to use CRLF as an end-of-line marker, but really only has CRs? SciTE will quickly make that visible.
    - Infragistics Controls: We do a lot of ASP.NET development, and frequently use the WebGrid, WebTab, and date picker controls. We will likely be implementing the Hierarchical Data Grid soon. Infragistics has control suites for WebForms, WinForms, Silverlight, and, coming soon, MVC/jQuery.
    - WinZip (with the command-line add-in): The classic compression program with a great command-line interface that allows me to build those CMD (and soon PowerShell) programs for automated compression jobs. Our versioned build packages are zip files.
    - XML Notepad: Haven't used this a lot myself, but one of my team really likes it for examining large XML files.
    - LINQPad: Again, haven't used this one a lot, but it was recommended to me for learning and practicing my LINQ skills, which will come in handy as we implement Entity Framework.
    - SQL Sentry Plan Explorer: SQL Server Showplan on steroids. Great for helping you focus on the parts of a large query that are of most importance. Also great for just compressing the graphical plan into a more readable layout.
    - Araxis Merge: A great diff and merge tool. SourceGear provides a great tool called DiffMerge that we use all the time, but occasionally I like the cross-edit capabilities of Araxis Merge. For a while, we also produced diff reports in HTML that showed all the changes that occurred between two releases. This was most important when we were putting out very small, but very important, hot fixes on a very politically hot system. The reports produced by Araxis Merge gave the Director of IS assurance that we were not accidentally introducing ripples throughout the system with our releases.
    - Idera SQL Admin Toolset: A great collection of tools, including a password checker to help analyze your SQL Server for weak user passwords and a Backup Status tool to quickly scan a large list of servers and databases to identify any that are overdue for backups. Particularly helpful for highlighting new databases that have been deployed without getting included in your backup processing. I also like Space Analyzer to keep an eye on disk space consumed by database files.
    - Idera SQL Job Manager: This free tool provides a nice calendar view of SQL Server job schedules, but to a degree, you also get what you pay for. We will be purchasing SQL Sentry Event Manager later this year as an even better job schedule reviewer/manager. But in the meantime, this at least gives me a good view of potential resource conflicts across multiple instances of SQL Server.
    - DBFViewer 2000: I inherited a couple of FoxPro databases that I have to keep an eye on occasionally and have not yet been able to migrate to SQL Server.
    - Balsamiq Mockups: We are still in evaluation mode on this tool, but I really like it as a quick UI mockup tool that does not require Visual Studio, so someone other than a programmer can do UI design. The interface looks hand-drawn, which definitely has some psychological benefits when communicating with users, too.
    - FeedDemon: I have to stay on top of my WAY TOO MANY blog subscriptions somehow. I may read blogs on a couple of different computers, and FeedDemon's integration with Google Reader allows me to keep them all in sync. I don't particularly like the Google Reader interface, or the fact that it always wanted to mark articles as read just because I scrolled past them. FeedDemon solves this problem for me, and provides a multi-tabbed interface, which is good because fairly frequently one blog will link to something else I want to read, and I can end up with a half-dozen open tabs all from one article.
    - Synergy+: In my office, I run four monitors across two computers, all with one mouse and keyboard. Synergy is the magic software that makes this work.
    - TweetDeck: I'm not the most active tweeter in the world, but when I want to check in with the Twitterverse, this really helps. I have found the #sqlhelp and #PoshHelp hash tags particularly useful, and I also have columns set up to make it easy to monitor #sqlpass, #PASSProfDev, and short-term events like #sqlsat68.

    Whew! That's a lot. No wonder it took me a couple of days to get everything set up the way I wanted it. Oh, that and actually getting some work accomplished at the same time. Anyway, I know that is a huge dump of info, and most people never make it here to the end, so for those who did, let me say, CONGRATULATIONS, you made it! I hope you'll find a new tool or two to make your work life a little easier.


  • Application Lifecycle Management Tools

    - by John K. Hines
    Leading a team comprised of three former teams means that we have three of everything.  Three places to gather requirements, three (actually eight or nine) places for customers to submit support requests, three places to plan and track work. We’ve been looking into tools that combine these features into a single product.  Not just Agile planning tools, but those that allow us to look in a single place for requirements, work items, and reports. One of the interesting choices is Software Planner by Automated QA (the makers of Test Complete).  It's a lovely tool with real end-to-end process support.  We’re probably not going to use it for one reason – cost.  I’m sure our company could get a discount, but it’s on a concurrent user license that isn’t cheap for a large number of users.  Some initial guesswork had us paying over $6,000 for 3 concurrent users just to get started with the Enterprise version.  Still, it’s intuitive, has great Agile capabilities, and has a reputation for excellent customer support. At the moment we’re digging deeper into Rational Team Concert by IBM.  Reading the docs on this product makes me want to submit my resume to Big Blue.  Not only does RTC integrate everything we need, but it’s free for up to 10 developers.  It has beautiful support for all phases of Scrum.  We’re going to bring the sales representative in for a demo. This marks one of the few times that we’re trying to resist the temptation to write our own tool.  And I think this is the first time that something so complex may actually be capably provided by an external source.   Hooray for less work! Technorati tags: Scrum Scrum Tools


  • Which tools do you use for development in your company? Please be exact [closed]

    - by predrag.music
    If you are a professional PHP/(My/Postgre/?)SQL/? developer working in a professional team, I would like to know which tools you use for development in your company. I do not care which tool is better or worse, just "which tools you use", if it is not TOP SECRET :) For example, these are just some of the tools I/we use (roughly in order of how much they are used):
    - Pen, paper
    - Lots of coffee, cola ... let me think ... mmmm ... yeah, more coffee :)
    - All kinds of books (lots of books)
    - OS: Windows / Mac OS X
    - Server: hosted (CentOS) / at work Mac OS X
    - Dev server: XAMPP / MAMP / LAMP
    - Editor: Notepad++
    - IDE: NetBeans / Zend Studio / Eclipse
    - Version control system: Mercurial / SVN
    - FTP: FileZilla mostly / ...
    - Passwords: KeePass
    - JS / Ajax: jQuery / pure JS / jQuery UI
    - Framework: CI / Zend / pure PHP
    - Database: MySQL / other
    - ORM: the framework's database layer (not an ORM, I know, but...) / Doctrine (2) / no ORM
    - Debugging: Xdebug (PHP) / Firebug (Ajax/JS/HTML/CSS/...) / framework profiler / ...
    - Dreaming: about...
    - Thinking: not about chaos in ? direction ...
    - Anything else that comes to mind
    - A zillion other things I know but can't remember: stuff I forgot, gave up on, deleted, lost, said "never again" to, never had time for, have on my computer but can't find (or don't even know I have, at least 2-3 times over), and stuff I said I'd check later and never checked again for all sorts of "perfectly justified" reasons (time, memory, wife :), whatever, ...)
    What is the reason I'm asking this? :) Looking forward to seeing a lot of answers.


  • Code Measuring and Metrics Tools?

    - by David
    I'm in the process of setting up a build server for personal projects. This server will handle all the normal CI stuff, including running large suites of tests (unit, integration, automated UI). While I'm working out the kinks for including code coverage output with MSTest, it occurs to me that there may be lots of tools out there which give me additional metrics other than just code coverage. FxCop comes to mind as an example. Though I'm sure there are others. Anything that can generate useful reportable data and metrics would be good. Whether it's class dependency charts (looking for Law of Demeter violations, for example), analyses of the uses of classes/functions (looking for a function that isn't used in the system other than just the tests, for example), and so on. I'm not sure the right way to formulate the question, since polling questions or "What's your favorite code analysis tool" aren't very good. But I'm essentially just looking for recommendations on what metrics to gather and the tools that can gather them. The eventual vision for something like this is to have the CI server run a bunch of automated tests and analysis tools and track performance metrics over time. Imagine a dashboard full of graphs plotting these metrics over time. The lines should all relatively be at an equilibrium, and if one starts to stray toward the negative then it's an early indication of problems with the code. In the age old struggle to quantify code quality with management, this sounds like a potentially helpful means of doing just that.
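    The closing idea above (plot each metric over time and treat a sudden departure from its usual level as an early warning) can be prototyped with very little code before committing to a particular analysis tool. The sketch below is illustrative Python with made-up metric names and thresholds, not part of any of the tools mentioned:

        import statistics
        from collections import defaultdict

        history: dict = defaultdict(list)   # metric name -> list of historical values

        def record(metric: str, value: float, window: int = 20, tolerance: float = 2.0) -> bool:
            """Store a new data point; return True if it strays from the rolling baseline."""
            baseline = history[metric][-window:]
            history[metric].append(value)
            if len(baseline) < 5:                        # not enough history to judge yet
                return False
            mean = statistics.fmean(baseline)
            spread = statistics.stdev(baseline) or 1e-9  # guard against a perfectly flat line
            return abs(value - mean) > tolerance * spread

        # fed from each CI run, for example:
        if record("code_coverage_pct", 71.4):
            print("code coverage drifted from its baseline - investigate")
        if record("fxcop_warnings", 180):
            print("static-analysis warning count drifted - investigate")

    The same pattern works for any number the build can emit: coverage, warning counts, dependency-rule violations, or test run time.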


  • Visual Studio 2010 Modeling and Architecture Tools

    - by MikeParks
    Jennifer Marsman (Microsoft Evangelist) and Cameron Skinner (Microsoft Visual Studio Product Unit Manager) recently stopped by our office while they were passing through Louisville on their tour to give us a presentation on the new Visual Studio 2010 Modeling and Architecture Tools. I checked out these new features when the Visual Studio 2010 beta versions originally rolled out and have been really impressed with this stuff ever since. So it was pretty cool to actually learn some new techniques from Cameron himself, since he helped write the actual code behind some of those features. If you've upgraded to Visual Studio 2010 recently, I would highly recommend using the Architecture tools. They're awesome. If you want to make improvements, there is even an SDK for them, and there are plenty of blogs out there showing you how to use them. I've been waiting to find a tool like this, where I can really analyze the code in solutions and projects and see how everything ties together. It's really handy if you're asked to work on a new project and aren't familiar with how it works. Just run the tools, analyze the DLLs, learn how everything works, and then you'll be ready to implement new code! It's a great way to learn new systems quickly and easily, and it's all housed within the Visual Studio IDE. I just wanted to write a blog post to brag about it a little bit, so I figured I'd throw this up here. It's a must-have tool for developers and architects. Here are some screenshots from when I was using it earlier. Thanks everyone! - Mike


  • YouTube: Chrome Dev Tools Integration with NetBeans IDE!

    - by Geertjan
    Some time ago my colleague David Konecny discussed the question "What works better for you? NetBeans IDE or Chrome Developer Tools?". It's a good read. David highlights the point that it's not a question of either/or but both, since the two tools are like the apple/pear dichotomy. However, good news! The two worlds are not divided in NetBeans IDE 7.4. Changes you make in Chrome Developer Tools (CDT) are automatically persisted to the related files in NetBeans IDE, as you can see in a new YouTube clip I made today. The new integration of CDT with NetBeans IDE has been mentioned in the NetBeans IDE 7.4 New & Noteworthy, and it was also sighted on Twitter yesterday. Watch the movie above and within 5 minutes you too will see the simplicity and power of CDT integration with NetBeans IDE. In other news, I consider the above to be my favorite new feature (though it's a tough choice, since there are so many new features in NetBeans IDE 7.4), for the article "What is your favorite new NetBeans IDE 7.4 feature?"


  • What is the value of workflow tools?

    - by user16549
    I'm new to workflow development, and I don't think I'm really getting the "big picture". Or perhaps to put it differently, these tools don't currently "click" in my head. It seems that companies like to create business drawings to describe processes, and at some point someone decided that they could use a state-machine-like program to actually control processes from a lines-and-boxes diagram. Ten years later, these tools are huge and extremely complicated (my company is currently playing around with WebSphere, and I've attended some of the training; it's a monster. Even the so-called "minimalist" versions of these workflow tools, like Activiti, are huge and complicated, although not nearly as complicated as the beast that is WebSphere, afaict). What is the great benefit in doing it this way? I can kind of understand the simple lines-and-boxes diagrams being useful, but these things, as far as I can tell, are visual programming languages at this point, complete with conditionals and loops. Programmers here appear to be doing a significant amount of work in the lines-and-boxes layer, which to me just looks like a really crappy, really basic visual programming language. If you're going to go that far, why not just use some sort of scripting language? Have people thrown the baby out with the bathwater on this? Has the lines-and-boxes thing been taken to an absurd level, or am I just not understanding the value in all this? I'd really like to see arguments in defense of this from people who have worked with this technology and understand why it's useful.


  • SQLAuthority News – Social Media Series – Facebook and Google+

    - by pinaldave
    Pinal on Facebook and Google+ Unless you have been living under a rock for the last few years, you know that Facebook is the first and last word in social networking.  Everyone has a Facebook account – from your local store to the 10-year old school child.  Because of this ability to be completely connected to everyone in your entire life, keeping a Facebook page for a professional business can be tricky. For the most part, I use Facebook strictly for personal matters.  I am friends only with friends I know in the “real” world (as opposed to my “virtual” online friends) and with family, of course.  I chat with friends on Facebook and upload personal photos to share with family who are far away.  I hope this doesn’t make readers from my professional life feel left out.  You can follow me on Facebook at www.facebook.com/SQLAuth, but you should know that Twitter is probably the better place to find updates about SQL Server and my blog (you can follow me on Twitter at www.twitter.com/pinaldave). There are definitely businesses who keep in touch with their clients using Facebook, but I felt the need to keep my personal and professional life separate.  That’s why I was so excited to find out Google was coming out with their own social media site, Google+.  On Google+ I post some personal things as well, and there is a lot of overlap between what I put on Facebook and what I put on Google+.  But since Google+ has become so popular amongst the “techie” crowd, I have found that it’s a good place to follow some of the stars of the Microsoft world, like Scott Hanselman and Buck Woody. If you are also a member of Google+, I am looking to expand my circle there.  You can find me at https://plus.google.com/104990425207662620918/posts.  Google+ is the newest face in the social media world, and it still hasn’t found a good footing between personal and professional yet.  That’s why I felt it would be a good idea to jump on the site early and help them determine which way to go.  Maybe someday it will be a place where business and personal can mix. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Social Media


  • Android 5.0 (or 4.1) to be unveiled tonight at Google I/O; the "Jelly Beans" statue has been installed in Google's gardens

    Android 5.0 or 4.1 to be unveiled tonight at Google I/O: the "Jelly Beans" statue has been installed in Google's gardens. It is now all but official: the next version of Android will be presented tonight at Google I/O, Google's annual developer conference. The sculpture of the new dessert that serves as this version's nickname has just been installed in the gardens of Google's headquarters. Its photo was published last night on one of the company's official Google+ pages: [IMG]http://ftp-developpez.com/gordon-fowler/Jelly%20Beans%20Garden.jpg[/IMG] Jelly beans ("bonbons haricots") are the American equivalent of Dragibus.


  • Why are Awstats, Webalizer, and Google Analytics results so different?

    - by Matt
    I realize that comparing Awstats and Webalizer to Google Analytics is like comparing apples to oranges, but each of them tracks at least basic statistics about visitors and pages. So why are there often very significant differences in their data? For example, comparing Analytics with Awstats using some numbers from a small site over the past week:
    Awstats:
    - 78 unique visitors
    - 205 visits (2.62 visits/visitor)
    - 1,072 pages (5.22 pages/visit)
    Google Analytics:
    - 115 unique visitors
    - 240 visits (2.08 visits/visitor)
    - 1,275 pages (5.31 pages/visit)
    They're similar on the number of visits, but page views and uniques are quite different. I'm familiar with discrepancies of a much higher magnitude on some larger sites, so the gap seems to scale with traffic. What is the reason behind the different numbers, even for data as basic as unique visitors and page loads?
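    Part of the answer is that the two families of tools measure different populations with different definitions. Log analyzers such as Awstats and Webalizer count every request the web server recorded (including bots, feed readers, and visitors with JavaScript or cookies disabled) and typically approximate a "unique visitor" from the client IP address, sometimes combined with the user agent, while the page-tag approach counts only browsers that executed the Analytics script and accepted its cookie; NAT, proxies, bots, and blocked scripts therefore pull the numbers in different directions. A rough sketch of the log-side counting (illustrative Python, placeholder log file name, standard Apache "combined" log format):

        import re

        # host ident user [time] "METHOD path protocol" status size "referer" "user-agent"
        LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:[A-Z]+) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

        visitors, pageviews = set(), 0
        with open("access.log", encoding="utf-8", errors="replace") as log:
            for line in log:
                m = LINE.match(line)
                if not m:
                    continue
                ip, path, user_agent = m.groups()
                if path.endswith((".css", ".js", ".png", ".gif", ".ico")):
                    continue                        # log analyzers filter assets; a JS tag never sees them at all
                pageviews += 1
                visitors.add((ip, user_agent))      # one common log-side notion of a "unique visitor"

        print(f"{len(visitors)} unique visitors, {pageviews} pages (log-file definition)")

    Neither definition is wrong; they simply count different things, which is why each tool is best used to watch trends against itself rather than compared head to head with the others.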

