Search Results

Search found 25148 results on 1006 pages for 'distributed source contr'.

Page 583/1006 | < Previous Page | 579 580 581 582 583 584 585 586 587 588 589 590  | Next Page >

  • Allen for Umbraco with location EXIF meta data

    - by Vizioz Limited
    The latest version of Allen for Umbraco has now hit the Apple App Store. We have managed to add some nice improvements in this version, including: storing location and direction information when photos are taken within the app, embedding EXIF data into the images when they are uploaded, background uploading, and pull-to-refresh on the media tree.

    Location and Direction: By default, when the camera is used within an application, the location and direction the camera is pointing are not stored in the image metadata. We have now added full support so that this data is included. We have also added a setting that allows you to prevent this data from being uploaded to your website; if you do not want the location data to be sent, you can turn it off within Allen. Note: please don't forget that location services do need to be turned on to allow the app to access the images in the phone's asset library. We have already had quite a few ideas from users for using this location data, from logging free parking in Denmark to geo-tagging holiday photos and linking the photos to Google Street View.

    Embedding EXIF data: We now embed all the metadata available on the iPhone into the image when it is uploaded to your server, which allows you to pull the data out and use it within your site. Have a look at Cultiv's Photo Meta Data package for great example code that automatically pulls this data out and populates properties on your Umbraco media item. We slightly modified the source code of this package so that it always extracts the image data, as the default package requires a property to allow the data to be extracted. It's an easy change; if you get stuck, add a comment to this post.

    Background Uploading: If you try to upload multiple images and need to start doing something else on your phone, you can now click the home button and the application will continue to upload your images in the background. As soon as it has finished you will receive a standard Apple notification.

    Pull to Refresh: Our final enhancement has been to add "pull to refresh" to the media trees; just pull the tree downwards with your finger and it will refresh. This is useful if you are adding items to your media tree while testing your site with Allen for Umbraco.

    Future enhancements... your ideas? If you have any ideas for future enhancements, feel free to add a comment below!

    Read the article

  • ASP.NET 4.0- Html Encoded Expressions

    - by Jalpesh P. Vadgama
    We all know the <%= expression %> syntax in ASP.NET; we can print any string to the page with it, and it is used heavily in ASP.NET MVC. ASP.NET 4.0 adds a new feature, HTML Encoded Expressions, which helps prevent cross-site scripting attacks because the output is HTML encoded. The new expression syntax is <%: expression %>, which automatically HTML-encodes the string. Let's take an example. I have created a HelloWorld method that returns a simple string containing characters that need to be HTML encoded. Below is the code for that:

    protected static string HelloWorld()
    {
        return "Hello World!!! returns from function()!!!>>>>>>>>>>>>>>>>>";
    }

    Now let's use HelloWorld in our page markup as below. I am going to use both expressions to show you the exact difference:

    <form id="form1" runat="server">
        <div>
            <strong><%: HelloWorld()%></strong>
        </div>
        <div>
            <strong><%= HelloWorld()%></strong>
        </div>
    </form>

    Now run the application; in the browser both look similar. But when you look at the page source in the browser, you can clearly see that one is HTML encoded and the other is not. That's it.. It's cool.. Stay tuned for more.. Happy Programming Technorati Tags: ASP.NET 4.0,HTMLEncode,C#4.0
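    For comparison, here is a small sketch that is not from the original post: the new <%: %> syntax is roughly equivalent to HTML-encoding the value yourself with the long-standing HttpUtility API, so the first expression above could also be written as:

    <strong><%= HttpUtility.HtmlEncode(HelloWorld()) %></strong>

    or, from code-behind:

    // Roughly what <%: %> does for plain strings (IHtmlString values are treated differently).
    string safe = System.Web.HttpUtility.HtmlEncode(HelloWorld());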

    Read the article

  • Big Data – Buzz Words: Importance of Relational Database in Big Data World – Day 9 of 21

    - by Pinal Dave
    In yesterday's blog post we learned what HDFS is. In this article we will take a quick look at the importance of the relational database in the Big Data world.

    A Big Question? Here are a few questions I have often received since the beginning of the Big Data series: Does the relational database have no place in the Big Data story? Is the relational database no longer relevant as Big Data evolves? Is the relational database not capable of handling Big Data? Is it true that one no longer has to learn about relational data if Big Data is the final destination? Every single time I hear that someone wants to learn about Big Data and is no longer interested in learning about relational databases, I find it a bit far-fetched. I am not here to give the ambiguous answer of "it depends". I am personally very clear that anyone who aspires to become a Big Data scientist or Big Data expert should learn about relational databases.

    NoSQL Movement: The NoSQL movement of recent times happened because of two important advantages of NoSQL databases: performance and a flexible schema. In my personal experience I have found both of these advantages when using NoSQL databases. There are instances when I have found the relational database too restrictive, because my data was unstructured or used datatypes my relational database does not support, and there are cases where a NoSQL solution performs much better than a relational database. I must say that I am a big fan of NoSQL solutions these days, but I have also seen occasions and situations where the relational database is still the perfect fit, even though the database is growing rapidly and has all the symptoms of Big Data.

    Situations Where the Relational Database Outperforms: Ad-hoc reporting is one of the most common scenarios where NoSQL does not have an optimal solution. Reporting queries often need to aggregate on columns which are not indexed and which are chosen while the report is running; in this kind of scenario NoSQL databases (document stores, distributed key-value stores) often do not perform well. For ad-hoc reporting I have often found it much easier to work with relational databases. SQL is the most popular computer language of all time. I have been using it for over 10 years and I feel that I will be using it for a long time to come. There are plenty of tools, connectors and general awareness of the SQL language in the industry. Pretty much every programming language has drivers written for SQL, and most developers learned the language during their school or college years. In many cases, writing a query in SQL is much easier than writing a query in a NoSQL-supported language. I believe this is the current situation, but in the future it could reverse if NoSQL query languages become equally popular. ACID (Atomicity, Consistency, Isolation, Durability): not all NoSQL solutions offer ACID compliance. There are always situations (for example banking transactions, eCommerce shopping carts, etc.) where, without ACID, operations can be invalid and database integrity can be at risk. Even though the data volume may indeed qualify as Big Data, there are always operations in the application that absolutely need an ACID-compliant, mature platform.

    The Mixed Bag: I have often heard the argument that all the big social media sites have nowadays moved away from relational databases. Actually, this is not entirely true. While researching Big Data and relational databases, I found that many of the popular social media sites use Big Data solutions along with relational databases. Many use relational databases to deliver results to end users at run time, and many still use a relational database as their major backbone. Here are a few examples: Facebook uses MySQL to display the timeline. (Reference Link) Twitter uses MySQL. (Reference Link) Tumblr uses sharded MySQL. (Reference Link) Wikipedia uses MySQL for data storage. (Reference Link) There are many more prominent organizations running large-scale applications that use relational databases along with various Big Data frameworks to satisfy their various business needs.

    Summary: I believe that the RDBMS is like vanilla ice cream. Everybody loves it and everybody has it. NoSQL and other solutions are like chocolate ice cream or custom ice cream: there is a huge base that loves them and wants them, but not every ice cream maker can make it just right for everyone's taste. No matter how fancy an ice cream store is, there is always plain vanilla ice cream available there. In the same way, there are always cases and situations in the Big Data story where the traditional relational database is part of the whole story. In real-world scenarios there will always be cases where relational database concepts and ideology are needed. It is extremely important to accept the relational database as one of the key components of Big Data instead of treating it as a substandard technology.

    Ray of Hope – NewSQL: In this module we discussed that there are places where we need ACID compliance from our Big Data application, and NoSQL does not support that out of the box. There is a new term coined for the applications/tools that support most of the properties of the traditional RDBMS while also supporting Big Data infrastructure – NewSQL.

    Tomorrow: In tomorrow's blog post we will discuss NewSQL.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Event 4098, 0x80070533 Logon failure: account currently disabled?

    - by Windos
    Having started to upgrade our PCs to Windows 7, we have noticed that we are getting Group Policy warnings in Event Viewer such as: "The user 'Word.qat' preference item in the 'a_Office2007_Users {A084A37B-6D4C-41C0-8AF7-B891B87FC53B}' Group Policy object did not apply because it failed with error code '0x80070533 Logon failure: account currently disabled.' This error was suppressed." Fifteen of these warnings appear every two hours on every Windows 7 PC; most of them relate to core Office applications, and two are for plug-ins to our document management system. These warnings aren't affecting the users, but it would be nice to track down their source before we roll out Windows 7 to the rest of the organisation. Any ideas as to where the logon issue could be coming from (all users are connecting to the domain and proxy, etc. fine)?

    Read the article

  • Path is too long

    - by kaleidoscope
    Bugged by the irritating "Path is too long after being fully qualified" error while running in the Development Fabric? The solution is pretty funny and, unfortunately, not so obvious. The culprit here is not your app but the Development Fabric. The DevFab accumulates a lot of temporary junk over its lifetime, comprising local storage locations, cached binaries, configuration, diagnostics information and cached compiled web site content files. These are typically stored at C:\Users\<username>\AppData\Local\dftmp. The Azure Tools will periodically clean this up, but sometimes you have to play janitor and take the law into your own hands ;). csrun.exe has quite a few tricks up its sleeve; one of them is the ability to clean the Development Fabric's temporary junk accumulated over time. You can do this by opening the Azure command prompt with elevated privileges and running csrun.exe /devfabric:shutdown and then csrun.exe /devfabric:clean. If the problem still persists, then the application directory structure could indeed be too long. A workaround is to change the Development Fabric temporary directory to point to a shorter path. The temporary directory path is controlled by the environment variable _CSRUN_STATE_DIRECTORY. You can try setting its value to something like "C:\WA" or "C:\A"; this will shave some 25+ characters off your path. Do not forget to close Visual Studio and explicitly shut down the dev fabric with csrun.exe /devfabric:shutdown (under elevated privileges, of course). Source: http://geekswithblogs.net/IUnknown/archive/2010/02/03/no-more-path-is-too-long.aspx :D Sarang, K

    Read the article

  • Need some advice regarding collision detection with the sprite changing its width and height

    - by Frank Scott
    So I'm messing around with collision detection in my tile-based game and everything works fine and dandy using this method. However, now I am trying to implement sprite sheets so my character can have walking and jumping animations. For one, I'd like to be able to have each frame be of variable size, I think. I want collision detection to be accurate, and during a jumping animation the sprite's height will be shorter (because of the calves meeting the hamstrings). Again, this also works fine at the moment; I can get the character to animate properly each frame and cycle through animations. The problems arise when the width and height of the character change. Oftentimes its position will be corrected by the collision detection system and the character will be rubber-banded to random parts of the map or even go outside the map bounds. For some reason, with the linked collision detection algorithm, when the width or height of the sprite is changed on the fly the entire algorithm breaks down. The solution I have found so far is to have a single width and height for the sprite that remains constant, and only adjust the source rectangle for drawing. However, I'm not sure exactly what to set as the sprite's constant bounding box, because it varies so much between the different animations. So now I'm not sure what to do. I'm toying with the idea of pixel-perfect collision detection but I'm not sure if it would really be worth it. Does anyone know how Braid does its collision detection? My game is also a 2D sidescroller and I was quite impressed with how it was handled in that game. Thanks for reading.
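    A minimal sketch of the "constant collision box, per-frame source rectangle" idea mentioned above, assuming an XNA-style API; the field and constant names here are made up for illustration and are not from the original post:

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    public class PlayerSprite
    {
        private const int CollisionWidth = 32;   // assumed fixed collision size
        private const int CollisionHeight = 48;

        private Texture2D spriteSheet;           // assumed sprite sheet texture
        private Rectangle[] frameSources;        // assumed per-frame source rectangles
        private int frameIndex;
        private Vector2 position;

        // Collision always uses the fixed box, regardless of the current animation frame.
        public Rectangle BoundingBox
        {
            get { return new Rectangle((int)position.X, (int)position.Y, CollisionWidth, CollisionHeight); }
        }

        // Drawing picks the frame-specific source rectangle; this never affects collision.
        public void Draw(SpriteBatch spriteBatch)
        {
            spriteBatch.Draw(spriteSheet, position, frameSources[frameIndex], Color.White);
        }
    }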

    Read the article

  • Are you aware of .NET Reflector Pro?

    I'm sure many of my readers know Reflector, the tool for decompiling assemblies to see what they contain, perhaps to investigate what Microsoft has done in the base .NET assemblies, to understand third-party assemblies, or maybe just to recover lost source code ;-). It's an invaluable tool to have in your toolbox. One scenario where it helps a lot is SharePoint development, in case you run into problems with the API. But are you aware that MS gave the product to Red Gate Software (http://www.red-gate.com), which released a Pro version of Reflector (http://www.red-gate.com/products/reflector/index.htm) a couple of months ago? Have a look at the feature set on top of the free version: full support for .NET 1.0, 1.1, 2.0, 3.0, 3.5, and 4.0; decompile an entire assembly to either C# or VB to view and debug in Visual Studio; step-through debugging of any assembly in Visual Studio (as long as it's not obfuscated): step into and set breakpoints anywhere in any assembly, watch variables in the decompiled code, and use Visual Studio's advanced debugging features in decompiled code (Set Next Statement, modify variable values, and dynamic expression evaluation in the Immediate window). I strongly encourage you to have a look at .NET Reflector in case you haven't done so already.

    Read the article

  • Which Linux distro for Mac Mini?

    - by spoon16
    I recently received a Mac Mini and would like to set it up as a web server and git source server. I would like to learn Linux, so I am interested in setting up my Mac Mini with Linux instead of OS X. Here are the main things that I will be using the Mac Mini for: git repositories (via Gitosis); a build server (building projects in the git repositories using commit hooks and running tests); simple websites (PHP); learning C++ in a non-Windows environment. What distribution would you recommend? Please provide some detail in your answer so that I can make a meaningful decision. Because I am looking to use the Mini as more of a server than a normal desktop machine I was thinking of Ubuntu Server; I'm not sure if that is overkill though, given the hardware I am using.

    Read the article

  • How to upgrade ClamAV on Ubuntu Hardy Heron 8.04 LTS?

    - by Jordan Lev
    I'm running a server on Ubuntu Hardy Heron 8.04 LTS, and when I installed ClamAV via aptitude, it installed version 0.94. That version has now been EOL'ed, but when I run "aptitude upgrade" it doesn't update ClamAV to the more recent version (0.96). I then followed these instructions on installing ClamAV from the PPA, but when I did that, I got a message saying "The following packages have been kept back: ... clamav clamav-base clamav-daemon clamav-freshclam ..." Does anyone know how to get Ubuntu 8.04 to do this update via aptitude or apt-get (I'm hoping to avoid having to compile from source, etc.)?

    Read the article

  • How to use SharePoint modal dialog box to display Custom Page Part2

    - by ybbest
    In the first part of the series, I showed you how to display and close a custom page in a SharePoint modal dialog using JavaScript. In this one, I'd like to show you how to display some information after the modal dialog is closed. You can download the source code here.

    1. Firstly, modify the element file as follows:

    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
      <CustomAction Id="ReportConcern" RegistrationType="ContentType" RegistrationId="0x010100866B1423D33DDA4CA1A4639B54DD4642" Location="EditControlBlock" Sequence="107" Title="Display Custom Page" Description="To Display Custom Page in a modal dialog box on this item">
        <UrlAction Url="javascript:
          function emitStatus(messageToDisplay) {
            statusId = SP.UI.Status.addStatus(messageToDisplay.message + ' ' + messageToDisplay.location);
            SP.UI.Status.setStatusPriColor(statusId, 'Green');
          }
          function portalModalDialogClosedCallback(result, value) {
            if (value !== null) { emitStatus(value); }
          }
          var options = {
            url: '{SiteUrl}' + '/_layouts/YBBEST/TitleRename.aspx?List={ListId}&amp;ID={ItemId}',
            title: 'Rename title',
            allowMaximize: false,
            showClose: true,
            width: 500,
            height: 300,
            dialogReturnValueCallback: portalModalDialogClosedCallback
          };
          SP.UI.ModalDialog.showModalDialog(options);" />
      </CustomAction>
    </Elements>

    2. In your code-behind, you can implement a close-dialog function as below. This will close your modal dialog box once the button is clicked and display a status bar.

    protected static string GetCloseDialogScript(string message)
    {
        var scriptBuilder = new StringBuilder();
        scriptBuilder.Append("<script type='text/javascript'>" +
            "SP.UI.ModalDialog.commonModalDialogClose(1,").Append(message).Append("); </script>");
        return scriptBuilder.ToString();
    }
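    For context, here is a minimal sketch of how the close-dialog script might be emitted from the page's button click handler; the handler name, status text and use of SPContext below are assumptions for illustration and are not taken from the original post:

    protected void SaveButton_Click(object sender, EventArgs e)
    {
        // ... rename the item's title here ...

        // Build the object literal consumed by portalModalDialogClosedCallback/emitStatus above.
        // The property names must match what the JavaScript callback reads (message, location).
        string dialogResult = "{ message: 'Title updated', location: '" + SPContext.Current.Web.Url + "' }";

        // Write the script and end the response so the dialog closes immediately.
        Response.Clear();
        Response.Write(GetCloseDialogScript(dialogResult));
        Response.Flush();
        Response.End();
    }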

    Read the article

  • notepad sql Unicode and Non Unicode

    - by RBrattas
    Hi, I have a Microsoft Notepad flat file with data and a vertical bar as the column delimiter. I get the following message: "cannot convert between unicode and non-unicode string data types". It seems it is my nvarchar(max) that creates the problem. I changed to varchar(max), but I still have the same problem. How do I insert my flat file into SQL Server 2005? Also, in the SQL Server 2005 Import and Export Wizard, on the flat file source Advanced tab, the OutputColumnWidth is 50. Does that mean my flat file column is limited to 50 characters? I hope not, because my column is longer than 50... Thank you, Rune

    Read the article

  • Java Spotlight Episode 75: Greg Luck on JSR 107 Java Temporary Caching API

    - by Roger Brinkley
    Recorded live at Jfokus 2012, an interview with Greg Luck on JSR 107, the Java Temporary Caching API. Joining us this week on the Java All Star Developer Panel is Alexis Moussine-Pouchkine, Java EE Developer Advocate. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes.

    News: JavaOne 2012 call for papers is open (closes April 9th). LightFish, Adam Bien's lightweight telemetry application. Java EE 6 sample code. JavaFX 1.2 and 1.3 EOL. Repeating Annotations in the Works.

    Events: March 26-29, EclipseCon, Reston, USA. March 27, Virtual Developer Days - Java (Asia Pacific (English)), 9:30am to 2:00pm IST / 12:00pm to 4:30pm SGT / 3:00pm - 7:30pm AEDT. April 4-5, JavaOne Japan, Tokyo, Japan. April 12, GreenJUG, Greenville, SC. April 17-18, JavaOne Russia, Moscow, Russia. April 18–20, Devoxx France, Paris, France. April 26, Mix-IT, Lyon, France. May 3-4, JavaOne India, Hyderabad, India.

    Feature Interview: Greg Luck founded Ehcache in 2003. He regularly speaks at conferences, writes and codes. He has also founded and maintains the JPam and Spnego open source projects, which are security focused. Prior to joining Terracotta in 2009, Greg was Chief Architect at Wotif.com, where he provided technical leadership as the company went from a single-product startup to a billion-dollar public company with multiple product lines. Before that Greg was a consultant for ThoughtWorks with engagements in the US and Australia in the travel, health care, geospatial, banking and insurance industries. Before doing programming, Greg managed IT. He was CIO at Virgin Blue, Tempo Services, Stamford Hotels and Resorts and Australian Resorts. He is a Chartered Accountant, and spent 7 years with KPMG in small business and insolvency.

    Mail Bag

    What's Cool: RT @harkje: To update an earlier tweet: #JavaFX feels like Swing with added convenience methods, better looking widgets, nice effects and...

    Read the article

  • How to sysprep SQL Server Express?

    - by Jim
    We plan to deploy a Hyper-V VHD with Windows Server 2008 R2 and SQL Server 2012 Express installed to multiple hosts. From my understanding, the correct way to do this is to install SQL Server in preparation mode, sysprep Windows, then complete the SQL Server installation when the VHD is deployed. I mostly followed the process in this blog post: http://sethusrinivasan.com/category/sysprep/ However, after the VHD is deployed, I'm unable to complete the SQL Server installation. It keeps saying "Upgrade matrix is incorrect". It seems that it's trying to upgrade itself to Enterprise edition (I was asked for a product key during install, but I skipped it). Could anyone share their experience in deploying VHDs with SQL Server (we're fine with either SQL Server 2008 R2 or 2012)? I think the source of my issue is that I can't select "Express Edition" when entering the product key at the completion stage, so the installation is trying to do an upgrade to Enterprise Edition. I have no idea why the drop-down list is empty.

    Read the article

  • Ubuntu 12.04.1 Radeon 9550 stuck with 640x480, works in Geexbox

    - by Betty
    I am a completely new user trying to set up Ubuntu on an old desktop. It has an AGP Radeon 9550 graphics card. I am running Ubuntu from a USB drive with persistence, as the PC currently has no hard drive. I seem to be stuck in 640x480 mode. The desktop itself is larger, but the monitor display is stuck at 640x480, and in Settings > Displays only the 640x480 option is available. What I have found out so far: the proprietary ATI drivers no longer support my card; if 3D isn't an issue (it's not) the open source driver should be fine, and it should be installed by default, so in theory I am using it already; xserver-xconf/pci/*.ids doesn't show any entries for the card's PCI ID; the hardware Additional Drivers tool shows no proprietary drivers. I tried booting the current version of Geexbox from a USB stick and it set the resolution correctly by default, so I know it can be done, but I have no idea how. How can I tell which driver the card is using, and how can I get the higher resolutions back?

    Read the article

  • error when start Subversion Server in UberSVN

    - by dakiquang
    I'm using uberSVN on an Ubuntu server. Currently I can't check out or commit source from the Subversion server (domain.com:9880, the Subversion port). I tried to start httpd with the command sudo /opt/ubersvn/bin/httpdserverctl start, but I get this error: httpd: Could not reliably determine the server's fully qualified domain name, using 127.0.1.1 for ServerName httpd (pid 895) already running. I then tried running the following: cd /opt/ubersvn/bin sudo ./ubersvncontrol start which gives: Starting SysV Tomcat Using CATALINA_BASE: /opt/ubersvn/tomcat Using CATALINA_HOME: /opt/ubersvn/tomcat Using CATALINA_TMPDIR: /opt/ubersvn/tomcat/temp Using JRE_HOME: /opt/ubersvn/jre Using CLASSPATH: /opt/ubersvn/tomcat/bin/bootstrap.jar Using CATALINA_PID: /opt/ubersvn/data/run/tomcat.pid Existing PID file found during start. Tomcat appears to still be running with PID 943. Start aborted. In the uberSVN web GUI, I tried to start the uberSVN server but get an error. How do I start the Subversion server?

    Read the article

  • Problems with wireless file copy - hangs during copy

    - by Eran Kampf
    I have this problem on my home network where, when I try to copy files over the wireless connection, at some point the copy fails and the machine doing the copy loses its connection to the network. I'm pretty much lost as to figuring out the source of the problem... Some more details that might help: it's not OS related (I have Windows 7 machines and Leopard machines and this occurs on both); I'm using a 3Com OfficeConnect router; wired transfers work fine; and other programs that require heavy wireless traffic work just fine: streaming large HD movies, Xbox Live! ... I'm lost as to how to even begin trying to diagnose the problem, so any tips are welcome...

    Read the article

  • What is ODBC?

    According to Microsoft, ODBC is a specification for a database API. This API is database and operating system agnostic because the primary goal of the ODBC API is to be language-independent. Additionally, the open functions of the API are implemented by the manufacturers of DBMS-specific drivers. Developers can use these exposed functions from within their own custom applications so that they can communicate with a DBMS through the language-independent drivers.

    ODBC Advantages: Multiple ODBC drivers exist for each DBMS; for example, for Oracle there are Oracle's ODBC driver, Merant's Oracle driver, and Microsoft's Oracle driver. ODBC drivers are constantly updated for the latest data types. ODBC allows for more control when querying. ODBC allows for isolation levels.

    ODBC Disadvantages: ODBC requires a DSN. ODBC is the proxy between an application and a database. ODBC is dependent on third-party drivers.

    ODBC transaction isolation levels are related to, and limited by, the transaction management capabilities of the data source. Transaction isolation levels: READ UNCOMMITTED – data is allowed to be read prior to the committing of a transaction. READ COMMITTED – data is only accessible after a transaction has completed. REPEATABLE READ – the same data value is read during the entire transaction. SERIALIZABLE – transactions have no effect on other transactions.
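    To make the API usage concrete, here is a minimal C# sketch of going through an ODBC driver via System.Data.Odbc; the DSN name, credentials and table are hypothetical, and the isolation level shown is just one of the four listed above:

    using System;
    using System.Data;
    using System.Data.Odbc;

    class OdbcSketch
    {
        static void Main()
        {
            // Hypothetical DSN configured for the target DBMS; any ODBC driver is used the same way.
            using (var connection = new OdbcConnection("DSN=SalesDsn;Uid=app;Pwd=secret"))
            {
                connection.Open();

                // Pick one of the isolation levels discussed above for this transaction.
                using (OdbcTransaction tx = connection.BeginTransaction(IsolationLevel.ReadCommitted))
                {
                    using (var command = new OdbcCommand("SELECT Id, Name FROM Customers", connection, tx))
                    using (OdbcDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                        }
                    }
                    tx.Commit();
                }
            }
        }
    }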

    Read the article

  • Retrieve the content of Microsoft Word document using OpenXml and C#

    - by ybbest
    One of my tasks involves retrieving the contents of a Microsoft Word document (Word 2007 and above). I tried to search for resources online without much luck; most of the examples are for writing content to a Word document using OpenXml. I decided to blog this as my own reference, and hopefully people who read this post will find it useful as well. Retrieving the contents of a Microsoft Word document using OpenXml is extremely simple. 1. Firstly, you need to download and install the Open XML SDK 2.0 for Microsoft Office (download link). 2. Create a console application, then add DocumentFormat.OpenXml.dll and WindowsBase.dll to the project; you can find these DLLs in the .NET tab of the Add Reference window. 3. Write a few lines of code to grab the contents from the Word document and display them in the console window (a sketch is shown after the reference list below). You can download the complete source code here. References: Getting Started with the Open XML SDK 2.0 for Microsoft Office, Walkthrough: Word 2007 XML Format, Word Processing How To, Open XML SDK 2.0 for Microsoft Office, Office Developer Center, openxmldeveloper, Open XML Package Explorer
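    As referenced in step 3 above, here is a minimal sketch of reading the text of a .docx file with the Open XML SDK; the file path is hypothetical and this is not the downloadable source from the original post:

    using System;
    using DocumentFormat.OpenXml.Packaging;
    using DocumentFormat.OpenXml.Wordprocessing;

    class Program
    {
        static void Main()
        {
            // Hypothetical document path; open read-only so the file is not locked for writing.
            using (WordprocessingDocument doc = WordprocessingDocument.Open(@"C:\Temp\Sample.docx", false))
            {
                Body body = doc.MainDocumentPart.Document.Body;

                // Print each paragraph's plain text; InnerText strips all formatting markup.
                foreach (Paragraph paragraph in body.Descendants<Paragraph>())
                {
                    Console.WriteLine(paragraph.InnerText);
                }
            }
        }
    }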

    Read the article

  • Work Item Visualizer for TFS 2010 - New Extension

    - by MikeParks
    I released another new extension to the Visual Studio Gallery today called Work Item Visualizer for TFS 2010. I've only heard positive things about it so far; hopefully it stays that way :) Basically, it creates a diagram of all work items linked to a work item ID which the user specifies in a search box. This extension was coded using DGML (the same graph rendering language used for the Visual Studio 2010 Architecture Tools). It was pretty cool getting a chance to create something using some of the newest technology out there. Well, I just wanted to throw a blog up to get the word out on it a little more. If you're using Visual Studio 2010 with Team Foundation Server 2010, feel free to check it out! Thanks everyone. Download Link: http://visualstudiogallery.msdn.microsoft.com/en-us/a35b6010-750b-47f6-a7a5-41f0fa7294d2

    What it does:
    · Creates a DGML graph to visualize linked TFS Work Items by entering a Work Item ID in the toolbar search box

    How it benefits you:
    · Allows you to easily analyze the hierarchy of your TFS Work Items
    · Gain the ability to perform basic risk/impact analysis when creating or editing Work Items
    · Great for meetings in the case that you need to discuss the entire scope of linked Work Items
    · Easier project planning
    · Eliminates the need to create TFS queries or reports to view a tree of Work Items
    · Easily lets you see the entire tree of work items linked to the one you're working on

    Navigation Tips:
    · Use Ctrl + Mouse Wheel Scroll to zoom in and out
    · Use Ctrl + Left Mouse click (and hold) to move the document around
    · Right click on the DGML area for more options (like copying the image or viewing in groups)
    · Clicking on each node highlights that node and the links connected to it
    · Colors in the legend can be changed
    · When work item nodes are deleted, the view is automatically updated
    · Double clicking on a work item node will open the Work Item's URL

    Try it out on work items that have several links and let us know what you think. A big thanks goes out to everyone working on the http://visualization.codeplex.com/ project for publishing the source code on CodePlex, which really helped me learn how DGML (Directed Graph Markup Language, new to the Visual Studio 2010 Architecture Tools) works!

    - Mike
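    Since DGML is just an XML dialect, here is a rough sketch of emitting a tiny work-item graph from C# with LINQ to XML; the node IDs and output path are made up, and this is not the extension's actual source:

    using System.Xml.Linq;

    class DgmlSketch
    {
        static void Main()
        {
            // DGML documents use this namespace and open in Visual Studio 2010's graph viewer.
            XNamespace ns = "http://schemas.microsoft.com/vs/2009/dgml";

            var graph = new XElement(ns + "DirectedGraph",
                new XElement(ns + "Nodes",
                    new XElement(ns + "Node", new XAttribute("Id", "42"), new XAttribute("Label", "Bug 42")),
                    new XElement(ns + "Node", new XAttribute("Id", "7"), new XAttribute("Label", "Task 7"))),
                new XElement(ns + "Links",
                    // A link from the parent work item to its linked child.
                    new XElement(ns + "Link", new XAttribute("Source", "42"), new XAttribute("Target", "7"))));

            new XDocument(graph).Save(@"C:\Temp\WorkItems.dgml");
        }
    }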

    Read the article

  • ASP.NET Web Server Hardware Configuration

    - by Santa Te Banta
    I'm planning on deploying my ASP.NET web app to a production environment using a Windows Server 2003 machine, but I know nothing about CPU brand names and what's best. I know that 4 GB of RAM with anything over a 3 GHz clock speed will be a good bet and will serve a large number of users, but tell me: what are the latest and greatest processor brand names for running a Windows Server 2003 OS today? And which edition of Windows Server 2003 do I need out of the following, if I have to run a website supporting about 100,000 (a hundred thousand) users, 60% of whom are expected to be online at all times? Web Edition, Standard, Enterprise, Datacenter (source: http://en.wikipedia.org/wiki/Windows_Server_2003). The article says that the Web edition can only support up to 2 GB of RAM. Will that be sufficient for the above user population?

    Read the article

  • Getting WCF Services in a Silverlight solution to play nice on deployment

    - by brendonpage
    I have come across two issues with deploying WCF services in a Silverlight solution; admittedly one is more of a hiccup, and it only occurs if you take the easy way out and reference your services through Visual Studio.

    The First Issue: This occurs when you deploy your WCF services to an IIS server. When you browse to the services using your web browser, you are greeted with "This collection already contains an address with scheme http. There can be at most one address per scheme in this collection.". When you make a call to this service from your Silverlight application, you get the extremely helpful "NotFound" error; this error message can be found in the Error property of the event arguments on the completed event handler for that call. As it did with me, this will leave most people scratching their heads, because the very same services work just fine on the ASP.NET Development Web Server and on my local IIS server. Now I'm no server/hosting/IIS expert, so I did a bit of searching when I first encountered this issue. I found out this happens because IIS supports multiple address bindings per protocol (http/https/ftp ... etc.) per web site, but WCF only supports binding to one address per protocol. This causes a problem when the WCF service is hosted on a site with multiple address bindings, because IIS provides all of the bindings to the host factory when running the service. While this problem occurs mainly on shared hosting solutions, it is not limited to shared hosting; it just seems like all shared hosting providers set up sites on their servers with multiple address bindings. For interest's sake, I added functionality to the example project attached to this post to dump the addresses given to the WCF service by IIS into a log file. This was the output on the shared hosting solution I use:

    http://mydomain.co.za/Services/TestService.svc
    http://www.mydomain.co.za/Services/TestService.svc
    http://mydomain-co-za.win13.wadns.net/Services/TestService.svc
    http://win13/Services/TestService.svc

    As you can see, all these addresses are for the http protocol, which is where it all goes wrong for WCF.

    Fixes for the First Issue: There are a few ways to get around this. The first is the easiest: target .NET 4! Yes, that's right, in .NET 4 WCF services support multiple addresses per protocol. This functionality is controlled by an option which is on by default if you create a new project; you will need to turn it on if you are upgrading to .NET 4. To do this, set the multipleSiteBindingsEnabled property of the serviceHostingEnvironment tag in the web.config file to true, as shown below:

    <system.serviceModel>
        <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />
    </system.serviceModel>

    Beware: this ONLY works in .NET 4, so if you don't have a server with .NET 4 installed that you can deploy to, you will need to employ one of the other workarounds.

    The second option will work for .NET 3.5 and 4. For this option all you need to do is modify the web.config file and add baseAddressPrefixFilters to the serviceHostingEnvironment tag as shown below:

    <system.serviceModel>
        <serviceHostingEnvironment>
            <baseAddressPrefixFilters>
                <add prefix="http://www.mydomain.co.za"/>
            </baseAddressPrefixFilters>
        </serviceHostingEnvironment>
    </system.serviceModel>

    These will be used to filter the list of base addresses that IIS provides to the host factory.
    When specifying these prefix filters, be sure to specify filters which will only allow one result through, otherwise the entire exercise will be pointless. There is, however, a problem with this workaround: you are only allowed to specify one prefix filter per protocol, which means you can't add filters for all your environments. This therefore adds to the list of things to do before deploying or switching dev machines.

    The third option is the one I currently employ. It will work for .NET 3, 3.5 and 4, although it is not needed for .NET 4. For this option you create a custom host factory which inherits from the ServiceHostFactory class. In the implementation of the ServiceHostFactory you employ logic to figure out which of the base addresses given by IIS to use when creating the service host. The logic you use to do this is completely up to you. I have seen quite a few solutions that simply statically reference an index from the list of base addresses; this works for most situations but falls short in others. For instance, if the order of the base addresses were to change, it might end up returning an address that only resolves on the server's local network, like the last one in the example I gave at the beginning. Another instance: if a request comes in on a different protocol, like https, you will be creating the service host using an address which is on the incorrect protocol, like http. To reliably find the correct address to use, I use the address that the service was requested on. To accomplish this I use the HttpContext, which requires the service to operate with ASP.NET compatibility turned on. If for some reason running your services with ASP.NET compatibility on isn't an option, you can still use this method; you will just have to come up with your own logic for selecting the correct address.

    First you will need to enable ASP.NET compatibility for your hosting environment; to do this, set aspNetCompatibilityEnabled to true in the web.config file as shown below:

    <system.serviceModel>
        <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
    </system.serviceModel>

    You will then need to mark any services that are going to use the custom host factory to allow AspNetCompatibilityRequirements, as shown below:

    [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
    public class TestService
    {
    }

    Now for the custom host factory; this is where the logic lives that selects the correct address to create the service host with. The one I use is shown below:

    public class CustomHostFactory : ServiceHostFactory
    {
        protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
        {
            // Compose a prefix filter based on the requested uri
            string prefixFilter = HttpContext.Current.Request.Url.Scheme + "://" + HttpContext.Current.Request.Url.DnsSafeHost;
            if (!HttpContext.Current.Request.Url.IsDefaultPort)
            {
                prefixFilter += ":" + HttpContext.Current.Request.Url.Port.ToString() + "/";
            }

            // Find a base address that matches the prefix filter
            foreach (Uri baseAddress in baseAddresses)
            {
                if (baseAddress.OriginalString.StartsWith(prefixFilter))
                {
                    return new ServiceHost(serviceType, baseAddress);
                }
            }

            // Throw exception if no matching base address was found
            throw new Exception("Custom Host Factory: No base address matching '" + prefixFilter + "' was found.");
        }
    }

    The most important line in the custom host factory is the one that returns a new service host.
    This has to return a service host that specifies only one base address per protocol. Since I filter by the address the request came in on, I only need to create the service host with one address, and this address will always be of the correct protocol. Now that you have a custom host factory, you have to tell your services to use it. To do this, view the markup of the service by right-clicking on it in Solution Explorer and choosing "View Markup", then add/set the value of the Factory property to the full namespace path of your custom host factory. And that is it done; the service will now use the specified custom host factory.

    The Second Issue: As I mentioned earlier, this issue is more of a hiccup, but I thought it worthy of a mention so I included it. It only occurs when you add a service reference to a Silverlight project. Visual Studio will generate a lot of code for you; part of that generated code is the ServiceReferences.ClientConfig file. This file stores the endpoint configuration that is used when accessing your services through the generated proxy classes. Here is what that file looks like:

    <configuration>
        <system.serviceModel>
            <bindings>
                <customBinding>
                    <binding name="CustomBinding_TestService">
                        <binaryMessageEncoding />
                        <httpTransport maxReceivedMessageSize="2147483647" maxBufferSize="2147483647" />
                    </binding>
                    <binding name="CustomBinding_BrokenService">
                        <binaryMessageEncoding />
                        <httpTransport maxReceivedMessageSize="2147483647" maxBufferSize="2147483647" />
                    </binding>
                </customBinding>
            </bindings>
            <client>
                <endpoint address="http://localhost:49347/services/TestService.svc"
                    binding="customBinding" bindingConfiguration="CustomBinding_TestService"
                    contract="TestService.TestService" name="CustomBinding_TestService" />
                <endpoint address="http://localhost:49347/Services/BrokenService.svc"
                    binding="customBinding" bindingConfiguration="CustomBinding_BrokenService"
                    contract="BrokenService.BrokenService" name="CustomBinding_BrokenService" />
            </client>
        </system.serviceModel>
    </configuration>

    As you will notice, the addresses for the endpoints are set to the addresses of the services you added the service references from, so unless you are adding the service references from your live services, you will have to change these addresses before you deploy. This is little more than an annoyance really, but it adds to the list of things to do before you can deploy, and if left unchecked that list can get out of control.

    Fix for the Second Issue: The way you would usually access a service added this way is to create an instance of the proxy class like so:

    BrokenServiceClient proxy = new BrokenServiceClient();

    Closer inspection of these generated proxy classes reveals that there are a few overloaded constructors, one of which allows you to specify the endpoint address to use when creating the proxy. From here, all you have to do is come up with some logic that will provide you with the relative path to your services.
    Since my WCF services are usually hosted in the same project as my Silverlight app, I use the class shown below:

    public class ServiceProxyHelper
    {
        /// <summary>
        /// Create a broken service proxy
        /// </summary>
        /// <returns>A broken service proxy</returns>
        public static BrokenServiceClient CreateBrokenServiceProxy()
        {
            Uri address = new Uri(Application.Current.Host.Source, "../Services/BrokenService.svc");
            return new BrokenServiceClient("CustomBinding_BrokenService", address.AbsoluteUri);
        }
    }

    Then I create an instance of the proxy class using my service helper class like so:

    BrokenServiceClient proxy = ServiceProxyHelper.CreateBrokenServiceProxy();

    The way this works is that Application.Current.Host.Source returns the URL of the ClientBin folder the Silverlight app is hosted in, and "../Services/BrokenService.svc" is then used as the relative path to the service from the ClientBin folder; combined by the Uri object, this gives me the URL to my service. The "CustomBinding_BrokenService" is a reference to the endpoint configuration in the ServiceReferences.ClientConfig file. Yes, this means you still need the ServiceReferences.ClientConfig file. All this is doing is using a different endpoint address than the one specified in the ServiceReferences.ClientConfig file; all the other settings from the ServiceReferences.ClientConfig file are still used when creating the proxy. I have uploaded an example project which covers the custom host factory solution for the first issue and everything from the second issue. I included the code to write a list of base addresses to a log file in my implementation of the custom host factory; this is not needed for the custom host factory to function and can safely be removed. Download (WCFServicesDeploymentExample.zip)

    Read the article

  • uwsgi_params not in nginx

    - by Halit Alptekin
    Firstly I set up nginx and uwsgi via apt-get, and I added the lines below to the nginx conf file (/etc/nginx/conf.d/default.conf):

    server {
        listen 80;
        server_name <replace with your hostname>;
        #Replace paths for real deployments...
        access_log /tmp/access.log;
        error_log /tmp/error.log;
        location / {
            include uwsgi_params;
            uwsgi_pass 127.0.0.1:8889;
        }
    }

    I got this error:

    Starting nginx: [emerg]: open() "/etc/nginx/uwsgi_params" failed (2: No such file or directory) in /etc/nginx/conf.d/default.conf:11
    configuration file /etc/nginx/nginx.conf test failed

    If I add the uwsgi_params file from uwsgi's source, it turns out to have been a simple error. Thanks

    Read the article

  • iTunes 9.2 OS X - iTunes DJ not functioning correctly

    - by David Sykes
    I have been using iTunes on my Mac for 4 years now without any problems at all. Recently, however, the iTunes DJ function has started misbehaving. Instead of showing the 5 previously played songs and 15 upcoming songs randomly selected from the playlist I have defined in the settings, it instead shows the 5 previously played songs plus many more than 15 upcoming songs. In addition, the upcoming songs are not from the playlist; they are from about 7 or so albums, in album order. I have changed settings, changed the source playlist, restarted iTunes, and rebooted the machine, all with no change. Does anybody have any idea how to get this working again? Thanks, Dave

    Read the article

  • Why can blocked IPs get through my iptables? What's wrong with this configuration?

    - by NeedSomeHelp
    (Why can/How are) blocked IPs (get/getting) through my iptables? Hello and thanks for your consideration... I have configured iptables and included (below) output from the command "iptables --line-numbers -n -L" yet IP addresses (like 31.41.219.180) from IP blocks I have already blocked are getting through. Please take a look and share any input you may have. Thank you. P.S. The initial ACCEPT IP addresses are for CloudFlare. . Chain INPUT (policy DROP 0 packets, 0 bytes) num pkts bytes target prot opt in out source destination 1 32267 14M ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 state RELATED,ESTABLISHED 2 0 0 REJECT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp flags:!0x17/0x02 state NEW reject-with tcp-reset 3 149 8570 DROP all -- * * 0.0.0.0/0 0.0.0.0/0 state INVALID 4 434 25606 ACCEPT all -- lo * 0.0.0.0/0 0.0.0.0/0 5 0 0 ACCEPT udp -- * * 103.21.244.0/22 0.0.0.0/0 6 0 0 ACCEPT udp -- * * 103.22.200.0/22 0.0.0.0/0 7 0 0 ACCEPT udp -- * * 103.31.4.0/22 0.0.0.0/0 8 0 0 ACCEPT udp -- * * 104.16.0.0/12 0.0.0.0/0 9 0 0 ACCEPT udp -- * * 108.162.192.0/18 0.0.0.0/0 10 0 0 ACCEPT udp -- * * 141.101.64.0/18 0.0.0.0/0 11 0 0 ACCEPT udp -- * * 162.158.0.0/15 0.0.0.0/0 12 0 0 ACCEPT udp -- * * 173.245.48.0/20 0.0.0.0/0 13 0 0 ACCEPT udp -- * * 188.114.96.0/20 0.0.0.0/0 14 0 0 ACCEPT udp -- * * 190.93.240.0/20 0.0.0.0/0 15 0 0 ACCEPT udp -- * * 197.234.240.0/22 0.0.0.0/0 16 0 0 ACCEPT udp -- * * 198.41.128.0/17 0.0.0.0/0 17 0 0 ACCEPT udp -- * * 199.27.128.0/21 0.0.0.0/0 18 0 0 ACCEPT tcp -- * * 103.21.244.0/22 0.0.0.0/0 19 9 468 ACCEPT tcp -- * * 103.22.200.0/22 0.0.0.0/0 20 0 0 ACCEPT tcp -- * * 103.31.4.0/22 0.0.0.0/0 21 0 0 ACCEPT tcp -- * * 104.16.0.0/12 0.0.0.0/0 22 858 44616 ACCEPT tcp -- * * 108.162.192.0/18 0.0.0.0/0 23 376 19552 ACCEPT tcp -- * * 141.101.64.0/18 0.0.0.0/0 24 0 0 ACCEPT tcp -- * * 162.158.0.0/15 0.0.0.0/0 25 257 13364 ACCEPT tcp -- * * 173.245.48.0/20 0.0.0.0/0 26 0 0 ACCEPT tcp -- * * 188.114.96.0/20 0.0.0.0/0 27 0 0 ACCEPT tcp -- * * 190.93.240.0/20 0.0.0.0/0 28 0 0 ACCEPT tcp -- * * 197.234.240.0/22 0.0.0.0/0 29 0 0 ACCEPT tcp -- * * 198.41.128.0/17 0.0.0.0/0 30 92 4784 ACCEPT tcp -- * * 199.27.128.0/21 0.0.0.0/0 31 0 0 DROP tcp -- * * 1.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 32 0 0 DROP tcp -- * * 101.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 33 0 0 DROP tcp -- * * 102.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 34 0 0 DROP tcp -- * * 103.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 35 18 1080 DROP tcp -- * * 109.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 36 0 0 DROP tcp -- * * 112.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 37 12 656 DROP tcp -- * * 113.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 38 0 0 DROP tcp -- * * 114.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 39 0 0 DROP tcp -- * * 115.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 40 8 352 DROP tcp -- * * 116.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 41 0 0 DROP tcp -- * * 117.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 42 0 0 DROP tcp -- * * 118.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 43 2 120 DROP tcp -- * * 119.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 44 0 0 DROP tcp -- * * 120.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 45 0 0 DROP tcp -- * * 121.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 46 4 160 DROP tcp -- * * 122.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 47 4 240 DROP tcp -- * * 123.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 48 0 0 DROP tcp -- * * 125.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 49 0 0 DROP tcp -- * * 134.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 50 0 0 DROP tcp -- * * 146.185.0.0/16 0.0.0.0/0 tcp dpts:1:50000 51 6 360 DROP tcp -- * * 148.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 52 0 0 DROP tcp -- * * 151.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 53 0 0 
DROP tcp -- * * 175.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 54 0 0 DROP tcp -- * * 176.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 55 0 0 DROP tcp -- * * 177.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 56 46 2696 DROP tcp -- * * 178.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 57 0 0 DROP tcp -- * * 179.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 58 4 224 DROP tcp -- * * 180.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 59 0 0 DROP tcp -- * * 181.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 60 0 0 DROP tcp -- * * 182.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 61 34 2040 DROP tcp -- * * 183.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 62 0 0 DROP tcp -- * * 185.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 63 0 0 DROP tcp -- * * 186.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 64 0 0 DROP tcp -- * * 187.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 65 18 912 DROP tcp -- * * 188.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 66 0 0 DROP tcp -- * * 189.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 67 0 0 DROP tcp -- * * 190.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 68 2 120 DROP tcp -- * * 192.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 69 0 0 DROP tcp -- * * 196.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 70 0 0 DROP tcp -- * * 197.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 71 5 300 DROP tcp -- * * 198.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 72 0 0 DROP tcp -- * * 2.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 73 0 0 DROP tcp -- * * 200.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 74 0 0 DROP tcp -- * * 201.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 75 6 360 DROP tcp -- * * 202.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 76 0 0 DROP tcp -- * * 203.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 77 4 160 DROP tcp -- * * 210.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 78 0 0 DROP tcp -- * * 211.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 79 2 96 DROP tcp -- * * 212.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 80 4 240 DROP tcp -- * * 213.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 81 0 0 DROP tcp -- * * 214.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 82 0 0 DROP tcp -- * * 215.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 83 0 0 DROP tcp -- * * 216.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 84 0 0 DROP tcp -- * * 217.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 85 4 172 DROP tcp -- * * 218.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 86 12 576 DROP tcp -- * * 219.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 87 7 372 DROP tcp -- * * 220.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 88 0 0 DROP tcp -- * * 222.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 89 0 0 DROP tcp -- * * 27.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 90 12 608 DROP tcp -- * * 31.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 91 11 528 DROP tcp -- * * 37.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 92 0 0 DROP tcp -- * * 41.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 93 0 0 DROP tcp -- * * 42.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 94 0 0 DROP tcp -- * * 43.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 95 8 480 DROP tcp -- * * 46.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 96 0 0 DROP tcp -- * * 49.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 97 6 360 DROP tcp -- * * 5.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 98 0 0 DROP tcp -- * * 58.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 99 0 0 DROP tcp -- * * 60.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 100 4 160 DROP tcp -- * * 61.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 101 32 1848 DROP tcp -- * * 62.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 102 0 0 DROP tcp -- * * 63.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 103 20 1200 DROP tcp -- * * 64.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 104 0 0 DROP tcp -- * * 65.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 105 266 15960 DROP tcp -- * * 66.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 106 3 180 DROP tcp -- * * 69.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 107 5 272 DROP tcp -- * * 72.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 108 0 0 DROP tcp -- * * 78.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 109 0 0 DROP tcp -- * * 81.0.0.0/8 
0.0.0.0/0 tcp dpts:1:50000 110 3 180 DROP tcp -- * * 82.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 111 0 0 DROP tcp -- * * 83.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 112 8 384 DROP tcp -- * * 84.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 113 0 0 DROP tcp -- * * 85.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 114 0 0 DROP tcp -- * * 86.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 115 6 360 DROP tcp -- * * 87.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 116 7 408 DROP tcp -- * * 88.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 117 0 0 DROP tcp -- * * 89.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 118 0 0 DROP tcp -- * * 90.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 119 0 0 DROP tcp -- * * 91.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 120 3 152 DROP tcp -- * * 92.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 121 20 992 DROP tcp -- * * 93.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 122 9 512 DROP tcp -- * * 94.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 123 5 272 DROP tcp -- * * 95.0.0.0/8 0.0.0.0/0 tcp dpts:1:50000 124 0 0 DROP udp -- * * 1.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 125 0 0 DROP udp -- * * 101.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 126 0 0 DROP udp -- * * 102.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 127 0 0 DROP udp -- * * 103.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 128 0 0 DROP udp -- * * 109.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 129 0 0 DROP udp -- * * 112.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 130 0 0 DROP udp -- * * 113.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 131 0 0 DROP udp -- * * 114.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 132 1 112 DROP udp -- * * 115.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 133 0 0 DROP udp -- * * 116.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 134 0 0 DROP udp -- * * 117.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 135 0 0 DROP udp -- * * 118.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 136 0 0 DROP udp -- * * 119.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 137 0 0 DROP udp -- * * 120.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 138 0 0 DROP udp -- * * 121.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 139 0 0 DROP udp -- * * 122.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 140 0 0 DROP udp -- * * 123.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 141 0 0 DROP udp -- * * 125.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 142 0 0 DROP udp -- * * 134.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 143 0 0 DROP udp -- * * 146.185.0.0/16 0.0.0.0/0 udp dpts:1:50000 144 0 0 DROP udp -- * * 148.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 145 0 0 DROP udp -- * * 151.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 146 0 0 DROP udp -- * * 175.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 147 0 0 DROP udp -- * * 176.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 148 1 70 DROP udp -- * * 177.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 149 0 0 DROP udp -- * * 178.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 150 0 0 DROP udp -- * * 179.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 151 0 0 DROP udp -- * * 180.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 152 0 0 DROP udp -- * * 181.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 153 0 0 DROP udp -- * * 182.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 154 0 0 DROP udp -- * * 183.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 155 0 0 DROP udp -- * * 185.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 156 1 74 DROP udp -- * * 186.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 157 0 0 DROP udp -- * * 187.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 158 0 0 DROP udp -- * * 188.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 159 0 0 DROP udp -- * * 189.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 160 0 0 DROP udp -- * * 190.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 161 0 0 DROP udp -- * * 192.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 162 0 0 DROP udp -- * * 196.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 163 0 0 DROP udp -- * * 197.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 164 0 0 DROP udp -- * * 198.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 165 0 0 DROP udp -- * * 2.0.0.0/8 0.0.0.0/0 udp 
dpts:1:50000 166 0 0 DROP udp -- * * 200.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 167 0 0 DROP udp -- * * 201.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 168 0 0 DROP udp -- * * 202.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 169 0 0 DROP udp -- * * 203.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 170 0 0 DROP udp -- * * 210.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 171 0 0 DROP udp -- * * 211.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 172 0 0 DROP udp -- * * 212.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 173 0 0 DROP udp -- * * 213.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 174 0 0 DROP udp -- * * 214.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 175 0 0 DROP udp -- * * 215.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 176 0 0 DROP udp -- * * 216.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 177 0 0 DROP udp -- * * 217.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 178 1 80 DROP udp -- * * 218.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 179 0 0 DROP udp -- * * 219.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 180 0 0 DROP udp -- * * 220.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 181 0 0 DROP udp -- * * 222.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 182 0 0 DROP udp -- * * 27.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 183 0 0 DROP udp -- * * 31.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 184 0 0 DROP udp -- * * 37.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 185 0 0 DROP udp -- * * 41.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 186 0 0 DROP udp -- * * 42.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 187 0 0 DROP udp -- * * 43.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 188 0 0 DROP udp -- * * 46.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 189 0 0 DROP udp -- * * 49.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 190 0 0 DROP udp -- * * 5.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 191 0 0 DROP udp -- * * 58.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 192 0 0 DROP udp -- * * 60.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 193 0 0 DROP udp -- * * 61.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 194 0 0 DROP udp -- * * 62.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 195 0 0 DROP udp -- * * 63.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 196 0 0 DROP udp -- * * 64.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 197 0 0 DROP udp -- * * 65.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 198 0 0 DROP udp -- * * 66.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 199 0 0 DROP udp -- * * 69.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 200 0 0 DROP udp -- * * 72.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 201 0 0 DROP udp -- * * 78.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 202 0 0 DROP udp -- * * 81.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 203 0 0 DROP udp -- * * 82.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 204 0 0 DROP udp -- * * 83.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 205 0 0 DROP udp -- * * 84.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 206 0 0 DROP udp -- * * 85.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 207 0 0 DROP udp -- * * 86.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 208 0 0 DROP udp -- * * 87.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 209 0 0 DROP udp -- * * 88.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 210 0 0 DROP udp -- * * 89.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 211 0 0 DROP udp -- * * 90.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 212 0 0 DROP udp -- * * 91.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 213 0 0 DROP udp -- * * 92.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 214 2 72 DROP udp -- * * 93.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 215 0 0 DROP udp -- * * 94.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 216 0 0 DROP udp -- * * 95.0.0.0/8 0.0.0.0/0 udp dpts:1:50000 217 0 0 DROP tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:12443 218 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:11443 219 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:11444 220 23 1104 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:8447 221 24 1152 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:8443 222 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:8880 
223 207 11096 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:80 224 19 996 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:443 225 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:21 226 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:22 227 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:587 228 4 216 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:25 229 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:465 230 14 840 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:110 231 2 120 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:995 232 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:143 233 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:993 234 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:106 235 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:3306 236 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:5432 237 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:9008 238 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:9080 239 0 0 ACCEPT udp -- * * 0.0.0.0/0 0.0.0.0/0 udp dpt:137 240 0 0 ACCEPT udp -- * * 0.0.0.0/0 0.0.0.0/0 udp dpt:138 241 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:139 242 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:445 243 0 0 ACCEPT udp -- * * 0.0.0.0/0 0.0.0.0/0 udp dpt:1194 244 0 0 ACCEPT udp -- * * 0.0.0.0/0 0.0.0.0/0 udp dpt:53 245 0 0 ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:53 246 73 4488 ACCEPT icmp -- * * 0.0.0.0/0 0.0.0.0/0 icmp type 8 code 0 247 77 23598 DROP all -- * * 0.0.0.0/0 0.0.0.0/0 Chain FORWARD (policy DROP 0 packets, 0 bytes) num pkts bytes target prot opt in out source destination 1 0 0 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 state RELATED,ESTABLISHED 2 0 0 REJECT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp flags:!0x17/0x02 state NEW reject-with tcp-reset 3 0 0 DROP all -- * * 0.0.0.0/0 0.0.0.0/0 state INVALID 4 0 0 ACCEPT all -- lo lo 0.0.0.0/0 0.0.0.0/0 5 0 0 DROP all -- * * 0.0.0.0/0 0.0.0.0/0 Chain OUTPUT (policy DROP 0 packets, 0 bytes) num pkts bytes target prot opt in out source destination 1 31004 25M ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 state RELATED,ESTABLISHED 2 1 333 REJECT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp flags:!0x17/0x02 state NEW reject-with tcp-reset 3 0 0 DROP all -- * * 0.0.0.0/0 0.0.0.0/0 state INVALID 4 434 25606 ACCEPT all -- * lo 0.0.0.0/0 0.0.0.0/0 5 328 21324 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0

    Read the article
