Search Results

Search found 227 results on 10 pages for 'classification'.

  • The true cost to get my XNA game on XBox?

    - by Fëanor
    There seem to be many hurdles to get one's game onto Xbox; so far I have uncovered: You need Visual Studio (once your game becomes commercial you cannot use Express - you have to pay for Professional). $1000+ You then buy an Xbox to find you also need a hard drive - so buy an Xbox hard drive too. $400 You need to buy an Xbox LIVE Gold subscription. $70 You need to buy an AppHub Creators Club subscription. $100 Then after all that I cannot even find the place on my Xbox to download indie games?!!! Seriously WTF - after doing all this I could have become proficient in WebGL and done it all for free... Before I go all the way down this path (hole), are there any other hidden hurdles before I can publish my game? UPDATE: "Indie Games are not available in Australia, due to the requirement for all games to be rated by the Australian Classification Board, and the prohibitive expenses involved." ... I'm going to have to break something....

  • C# multi-point selection on an image

    - by sinem
    I'm designing an interface in Visual C# for my image processing project. I'm not good at C# coding and I haven't enough time. I need to select points on an image and classify those points for use in my image processing code in VHDL. The original image will stay in a PictureBox and I will use the selected image in another PictureBox. How can I select points on an image? Please could you help me or point me to similar code? I'm getting so confused. Thank you.

  • Just Released: Oracle Instantis EnterpriseTrack 8.5

    - by Melissa Centurio Lopes
    Instantis EnterpriseTrack has been successfully integrated into the Oracle development process and the first release under Oracle is generally available. This release is a significant expansion of solution capabilities in resource management, project demand management, and project execution. It also includes customer-requested product enhancements, reporting enhancements, performance optimizations, and user experience improvements. Key enhancements include: New Resource Calendar functionality provides more precise capacity visibility for resource planning and increases project plan reliability. Support for Activity Labor Cost Capitalization allows project teams to easily mark any WBS activity as CapEx or OpEx with Labor Expense Type, and to enforce proper classification of Labor Expense Type for activities by setting defaults through activity templates or at the project level. New Variable Resource Rates functionality allows project stakeholders to specify resource rates accurately over time and account for wage revisions. Instantis EnterpriseTrack cloud and on-premise solutions provide a top-down approach to managing, tracking and reporting on enterprise strategies, projects, portfolios, processes, resources, and financials. Upgrade now or visit the Instantis EnterpriseTrack site to learn more.

  • Details on the following Natural Language Processing terms?

    - by wefwgeweg
    Named Entity Extraction (extract people, cities, organizations) Content Tagging (extract topic tags by scanning doc) Structured Data Extraction Topic Categorization (taxonomy classification by scanning doc... Bayesian) Text extraction (HTML page cleaning) Are there libraries that I can use to do any of the above functions of NLP? I don't really feel like forking out cash for AlchemyAPI.
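
    An editorial sketch (not from the asker): NLTK covers at least the named-entity item above. The calls below (word_tokenize, pos_tag, ne_chunk) are standard NLTK; the example sentence and the exact resource names to download are assumptions.

        import nltk

        # One-time model downloads; the identifiers are NLTK's standard names.
        for pkg in ("punkt", "averaged_perceptron_tagger", "maxent_ne_chunker", "words"):
            nltk.download(pkg)

        text = "Oracle acquired Instantis, a company based in Sunnyvale, California."
        tree = nltk.ne_chunk(nltk.pos_tag(nltk.word_tokenize(text)))

        # Collect (entity text, label) pairs such as PERSON, ORGANIZATION, GPE.
        entities = [(" ".join(word for word, tag in st.leaves()), st.label())
                    for st in tree.subtrees()
                    if st.label() in ("PERSON", "ORGANIZATION", "GPE")]
        print(entities)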

  • Any Naive Bayesian classifier in Python?

    - by asldkncvas
    Dear everyone, I have tried the Orange framework for naive Bayesian classification. The methods are extremely unintuitive, and the documentation is extremely disorganized. Does anyone here have another framework to recommend? I mostly use naive Bayes for now. I was thinking of using NLTK's NaiveBayesClassifier, but apparently it can't handle continuous variables. What are my options?
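
    One option not mentioned above: scikit-learn's GaussianNB models each continuous feature with a per-class normal distribution. A minimal sketch, assuming a recent scikit-learn; the toy data is invented for illustration.

        # Gaussian naive Bayes over continuous features.
        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        # Toy training data: rows are instances, columns are continuous features.
        X = np.array([[1.0, 20.5], [1.2, 19.8], [7.9, 2.1], [8.3, 2.4]])
        y = np.array([0, 0, 1, 1])  # class labels

        clf = GaussianNB().fit(X, y)
        print(clf.predict([[1.1, 20.0]]))        # -> [0]
        print(clf.predict_proba([[1.1, 20.0]]))  # per-class probabilities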

  • Python regex to parse a text file, get the items into a list and count them

    - by Nemo
    I have a text file which contains some data. I'm particularly interested in finding the count of the number of items in v_dims. The v_dims pattern in my text file looks like this:

        v_dims={ "Sales", "Product Family", "Sales Organization", "Region", "Sales Area", "Sales office", "Sales Division", "Sales Person", "Sales Channel", "Sales Order Type", "Sales Number", "Sales Person", "Sales Quantity", "Sales Amount" }

    So I'm thinking of getting all the elements in v_dims and dumping them into a Python list, then computing len(mylist) to get the count of the items. The challenge is in getting all the elements of v_dims from my text file and putting them into an empty list. The text file has data in the form of the v_dims pattern shown above, and some data has nested patterns of v_dims. Here's what I have tried, which failed. Any help is appreciated. TIA.

        import re

        fname = r"C:\Users\XXXX\Test.mrk"
        with open(fname, "r") as fo:
            content_as_string = fo.read()
        match = re.findall(r'v_dims={\"(.+?)\"}', content_as_string)

    Though I have a big text file, here's a snippet of the structure of my text file:

version "1"; // Computer generated object language file object 'MRKR' "Main" { Data_Type=2, HeaderBlock={ Version_String="6.3 (25)" }, Printer_Info={ Orientation=0, Page_Width=8.50000000, Page_Height=11.00000000, Page_Header="", Page_Footer="", Margin_type=0, Top_Margin=0.50000000, Left_Margin=0.50000000, Bottom_Margin=0.50000000, Right_Margin=0.50000000 }, Marker_Options={ Close_All="TRUE", Hide_Console="FALSE", Console_Left="FALSE", Console_Width=217, Main_Style="Maximized", MDI_Rect={ 0, 0, 892, 1063 } }, Dives={ { Dive="A", Windows={ { View_Index=0, Window_Info={ Window_Rect={ 0, -288, 400, 1008 }, Window_Style="Maximized Front", Window_Name="Theater [Previous Qtr Diveplan-Dive A]" }, Dependent_bool="FALSE", Colset={ Dive_Type="Normal", Dimension_Name="Theater", Action_List={ Actions={ { Action_Type="Select", select_type=5 }, { Action_Type="Select", select_type=0, Key_Names={ "Theater" }, Key_Indexes={ { "AMERICAS" } } }, { Action_Type="Focus", Focus_Rows="True" }, { Action_Type="Dimensions", v_dims={ "Theater", "Product Family", "Division", "Region", "Install at Country Name", "Connect Home Type", "Connect In Type", "SymmConnect Enabled", "Connect Home Refusal Reason", "Sales Order Channel Type", "Maintained By Group", "PS Flag", "Avalanche Flag", "Product Item Family" }, Xtab_Bool="False", Xtab_Flip="False" }, { Action_Type="Select", select_type=5 }, { Action_Type="Select", select_type=0, Key_Names={ "Theater", "Product Family", "Division", "Region", "Install at Country Name", "Connect Home Type", "Connect In Type", "SymmConnect Enabled", "Connect Home Refusal Reason", "Sales Order Channel Type", "Maintained By Group", "PS Flag", "Avalanche Flag" }, Key_Indexes={ { "AMERICAS", "ATMOS", "Latin America CS Division", "37000 CS Region", "Mexico", "", "", "", "", "DIRECT", "EMC", "N", "0" } } } } }, Num_Palette_cols=0, Num_Palette_rows=0 }, Format={ Window_Type="Tabular", Tabular={ Num_row_labels=8 } } } } } }, Widget_Set={ Widget_Layout="Vertical", Go_Button=1, Picklist_Width=0, Sort_Subset_Dimensions="TRUE", Order={ } }, Views={ { Data_Type=1, dbname="Previous Qtr Diveplan", diveline_dbname="Current Qtr Diveplan", logical_name="Current Qtr Diveplan", cols={ { name="Total TSS installs", column_type="Calc[Total TSS installs]", output_type="Number", format_string="."
}, { name="TSS Valid Connectivity Records", column_type="Calc[TSS Valid Connectivity Records]", output_type="Number", format_string="." }, { name="% TSS Connectivity Record", column_type="Calc[% TSS Connectivity Record]", output_type="Number" }, { name="TSS Not Applicable", column_type="Calc[TSS Not Applicable]", output_type="Number", format_string="." }, { name="TSS Customer Refusals", column_type="Calc[TSS Customer Refusals]", output_type="Number", format_string="." }, { name="% TSS Refusals", column_type="Calc[% TSS Refusals]", output_type="Number" }, { name="TSS Eligible for Physical Connectivity", column_type="Calc[TSS Eligible for Physical Connectivity]", output_type="Number", format_string="." }, { name="TSS Boxes with Physical Connectivty", column_type="Calc[TSS Boxes with Physical Connectivty]", output_type="Number", format_string="." }, { name="% TSS Physical Connectivity", column_type="Calc[% TSS Physical Connectivity]", output_type="Number" } }, dim_cols={ { name="Model", column_type="Dimension[Model]", output_type="None" }, { name="Model", column_type="Dimension[Model]", output_type="None" }, { name="Connect In Type", column_type="Dimension[Connect In Type]", output_type="None" }, { name="Connect Home Type", column_type="Dimension[Connect Home Type]", output_type="None" }, { name="SymmConnect Enabled", column_type="Dimension[SymmConnect Enabled]", output_type="None" }, { name="Theater", column_type="Dimension[Theater]", output_type="None" }, { name="Division", column_type="Dimension[Division]", output_type="None" }, { name="Region", column_type="Dimension[Region]", output_type="None" }, { name="Sales Order Number", column_type="Dimension[Sales Order Number]", output_type="None" }, { name="Product Item Family", column_type="Dimension[Product Item Family]", output_type="None" }, { name="Item Serial Number", column_type="Dimension[Item Serial Number]", output_type="None" }, { name="Sales Order Deal Number", column_type="Dimension[Sales Order Deal Number]", output_type="None" }, { name="Item Install Date", column_type="Dimension[Item Install Date]", output_type="None" }, { name="SYR Last Dial Home Date", column_type="Dimension[SYR Last Dial Home Date]", output_type="None" }, { name="Maintained By Group", column_type="Dimension[Maintained By Group]", output_type="None" }, { name="PS Flag", column_type="Dimension[PS Flag]", output_type="None" }, { name="Connect Home Refusal Reason", column_type="Dimension[Connect Home Refusal Reason]", output_type="None", col_width=177 }, { name="Cust Name", column_type="Dimension[Cust Name]", output_type="None" }, { name="Sales Order Channel Type", column_type="Dimension[Sales Order Channel Type]", output_type="None" }, { name="Sales Order Type", column_type="Dimension[Sales Order Type]", output_type="None" }, { name="Part Model Key", column_type="Dimension[Part Model Key]", output_type="None" }, { name="Ship Date", column_type="Dimension[Ship Date]", output_type="None" }, { name="Model Number", column_type="Dimension[Model Number]", output_type="None" }, { name="Item Description", column_type="Dimension[Item Description]", output_type="None" }, { name="Customer Classification", column_type="Dimension[Customer Classification]", output_type="None" }, { name="CS Customer Name", column_type="Dimension[CS Customer Name]", output_type="None" }, { name="Install At Customer Number", column_type="Dimension[Install At Customer Number]", output_type="None" }, { name="Install at Country Name", column_type="Dimension[Install at Country Name]", 
output_type="None" }, { name="TLA Serial Number", column_type="Dimension[TLA Serial Number]", output_type="None" }, { name="Product Version", column_type="Dimension[Product Version]", output_type="None" }, { name="Avalanche Flag", column_type="Dimension[Avalanche Flag]", output_type="None" }, { name="Product Family", column_type="Dimension[Product Family]", output_type="None" }, { name="Project Number", column_type="Dimension[Project Number]", output_type="None" }, { name="PROJECT_STATUS", column_type="Dimension[PROJECT_STATUS]", output_type="None" } }, Available_Columns={ "Total TSS installs", "TSS Valid Connectivity Records", "% TSS Connectivity Record", "TSS Not Applicable", "TSS Customer Refusals", "% TSS Refusals", "TSS Eligible for Physical Connectivity", "TSS Boxes with Physical Connectivty", "% TSS Physical Connectivity", "Total Installs", "All Boxes with Valid Connectivty Record", "% All Connectivity Record", "Overall Refusals", "Overall Refusals %", "All Eligible for Physical Connectivty", "Boxes with Physical Connectivity", "% All with Physical Conectivity" }, Remaining_columns={ { name="Total Installs", column_type="Calc[Total Installs]", output_type="Number", format_string="." }, { name="All Boxes with Valid Connectivty Record", column_type="Calc[All Boxes with Valid Connectivty Record]", output_type="Number", format_string="." }, { name="% All Connectivity Record", column_type="Calc[% All Connectivity Record]", output_type="Number" }, { name="Overall Refusals", column_type="Calc[Overall Refusals]", output_type="Number", format_string="." }, { name="Overall Refusals %", column_type="Calc[Overall Refusals %]", output_type="Number" }, { name="All Eligible for Physical Connectivty", column_type="Calc[All Eligible for Physical Connectivty]", output_type="Number" }, { name="Boxes with Physical Connectivity", column_type="Calc[Boxes with Physical Connectivity]", output_type="Number" }, { name="% All with Physical Conectivity", column_type="Calc[% All with Physical Conectivity]", output_type="Number" } }, calcs={ { name="Total TSS installs", definition="Total[Total TSS installs]", ts_flag="Not TS Calc" }, { name="TSS Valid Connectivity Records", definition="Total[PS Boxes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="% TSS Connectivity Record", definition="Total[PS Boxes w/ valid connectivity record (1=yes)] /Total[Total TSS installs]", ts_flag="Not TS Calc" }, { name="TSS Not Applicable", definition="Total[Bozes w/ valid connectivity record (1=yes)]-Total[Boxes Eligible (1=yes)]-Total[TSS Refusals]", ts_flag="Not TS Calc" }, { name="TSS Customer Refusals", definition="Total[TSS Refusals]", ts_flag="Not TS Calc" }, { name="% TSS Refusals", definition="Total[TSS Refusals]/Total[PS Boxes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="TSS Eligible for Physical Connectivity", definition="Total[TSS Eligible]-Total[Exception]", ts_flag="Not TS Calc" }, { name="TSS Boxes with Physical Connectivty", definition="Total[PS Physical Connectivity] - Total[PS Physical Connectivity, SymmConnect Enabled=\"Capable not enabled\"]", ts_flag="Not TS Calc" }, { name="% TSS Physical Connectivity", definition="Total[Boxes w/ phys conn]/Total[Boxes Eligible (1=yes)]", ts_flag="Not TS Calc" }, { name="Total Installs", definition="Total[Total Installs]", ts_flag="Not TS Calc" }, { name="All Boxes with Valid Connectivty Record", definition="Total[Bozes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="% All Connectivity Record", 
definition="Total[Bozes w/ valid connectivity record (1=yes)]/Total[Total Installs]", ts_flag="Not TS Calc" }, { name="Overall Refusals", definition="Total[Overall Refusals]", ts_flag="Not TS Calc" }, { name="Overall Refusals %", definition="Total[Overall Refusals]/Total[Bozes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="All Eligible for Physical Connectivty", definition="Total[Boxes Eligible (1=yes)]-Total[Exception]", ts_flag="Not TS Calc" }, { name="Boxes with Physical Connectivity", definition="Total[Boxes w/ phys conn]-Total[Boxes w/ phys conn,SymmConnect Enabled=\"Capable not enabled\"]", ts_flag="Not TS Calc" }, { name="% All with Physical Conectivity", definition="Total[Boxes w/ phys conn]/Total[Boxes Eligible (1=yes)]", ts_flag="Not TS Calc" } }, merge_type="consolidate", merge_dbs={ { dbname="connectivityallproducts.mdl", diveline_dbname="/DI_PSREPORTING/connectivityallproducts.mdl" } }, skip_constant_columns="FALSE", categories={ { name="Geography", dimensions={ "Theater", "Division", "Region", "Install at Country Name" } }, { name="Mappings and Flags", dimensions={ "Connect Home Type", "Connect In Type", "SymmConnect Enabled", "Connect Home Refusal Reason", "Sales Order Channel Type", "Maintained By Group", "Customer Installable", "PS Flag", "Top Level Flag", "Avalanche Flag" } }, { name="Product Information", dimensions={ "Product Family", "Product Item Family", "Product Version", "Item Description" } }, { name="Sales Order Info", dimensions={ "Sales Order Deal Number", "Sales Order Number", "Sales Order Type" } }, { name="Dates", dimensions={ "Item Install Date", "Ship Date", "SYR Last Dial Home Date" } }, { name="Details", dimensions={ "Item Serial Number", "TLA Serial Number", "Part Model Key", "Model Number" } }, { name="Customer Infor", dimensions={ "CS Customer Name", "Install At Customer Number", "Customer Classification", "Cust Name" } }, { name="Other Dimensions", dimensions={ "Model" } } }, Maintain_Category_Order="FALSE", popup_info="false" } } };
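
    A hedged working sketch (editor's addition, not the asker's final code): match each v_dims={ ... } block non-greedily, then pull the quoted names out of each block and count them. The path is the one given in the question.

        import re

        with open(r"C:\Users\XXXX\Test.mrk") as fo:
            text = fo.read()

        # Each v_dims block holds only quoted names (no nested braces), so a
        # non-greedy match up to the first closing brace is safe here.
        for block in re.findall(r'v_dims=\{(.*?)\}', text, re.DOTALL):
            items = re.findall(r'"([^"]*)"', block)
            print(len(items), items)

    On the first sample block above this prints 14, counting the duplicated "Sales Person" twice.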

  • Free Spectral Images database?

    - by Hani
    I am working on a project on object detection using multi-spectral imaging, but I am running into trouble because I don't have images to start testing my ideas. For now I am working on the hardware. Does anyone know of a free database of spectral images (faces, flowers, etc.) so that I can test my classification ideas until I finish the hardware?

  • How to compute the probability of a multi-class prediction using libsvm?

    - by Cuga
    I'm using libsvm and the documentation leads me to believe that there's a way to output the estimated probability that a predicted classification is correct. Is this so? And if so, can anyone provide a clear example of how to do it in code? Currently, I'm using the Java libraries in the following manner:

        SvmModel model = Svm.svm_train(problem, parameters);
        SvmNode x[] = getAnArrayOfSvmNodesForProblem();
        double predictedValue = Svm.svm_predict(model, x);
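
    For what it's worth (an editor's sketch, not from the asker): libsvm only produces probability estimates when the model is trained with its probability flag. The snippet uses libsvm's bundled Python wrapper (svmutil) with invented toy data; the Java counterparts are svm_parameter.probability = 1 and svm.svm_predict_probability(model, x, probEstimates).

        from svmutil import svm_train, svm_predict

        # Toy 2-class problem: labels plus sparse feature dicts {index: value}.
        y = [0, 0, 1, 1]
        x = [{1: 0.1, 2: 0.9}, {1: 0.2, 2: 0.8},
             {1: 0.9, 2: 0.1}, {1: 0.8, 2: 0.2}]

        model = svm_train(y, x, '-c 1 -b 1')  # -b 1 trains the probability model
        labels, acc, probs = svm_predict(y, x, model, '-b 1')
        print(labels)  # predicted classes
        print(probs)   # per-class probability estimates for each instance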

  • Data Mining project ideas?

    - by Andriyev
    Hi, I am looking for project ideas in the field of data mining. I expect to complete the project in a quarter and intend to use C++ on Linux as the environment. The course I'm taking aims to build the basics of data mining and covers topics like classification, regression modeling, clustering and association learning. Please point me to some good ideas which I can chew on. Cheers

  • Please help me interpret the naive Bayes result in Weka

    - by resmi
    Could anybody please help me interpret the following result, generated in Weka for classification using naive Bayes? Please explain clearly what Normal Distribution, Mean, StandardDev, WeightSum and Precision are. I am new to Weka.

        Naive Bayes Classifier
        Class Normal: Prior probability = 0.5
        1374195_at: Normal Distribution. Mean = 218.06 StandardDev = 6.0572 WeightSum = 3 Precision = 36.34333334
        1373315_at: Normal Distribution. Mean = 1142.58 StandardDev = 21.1589 WeightSum = 3 Precision = 126.95333339999999
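
    An editorial note on how those numbers are used, as I understand Weka's NaiveBayes: each numeric attribute gets a per-class normal distribution, Mean and StandardDev are that distribution's parameters, WeightSum is the (weighted) count of training instances in the class, and Precision is the granularity to which Weka rounded the attribute's values. The per-class likelihood of an attribute value is the Gaussian density, as in this sketch using the figures above:

        import math

        mean, stddev = 218.06, 6.0572  # 1374195_at, class Normal

        def gaussian(x):
            # Density of N(mean, stddev^2) at x.
            return (math.exp(-(x - mean) ** 2 / (2 * stddev ** 2))
                    / (math.sqrt(2 * math.pi) * stddev))

        # ~0.0659; combined with the prior (0.5) and the other attributes'
        # likelihoods to score the class.
        print(gaussian(218.06))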

  • Creating a music catalog in C# and extracting first 30 seconds as soon as the first words are sung

    - by Rad
    I have already read this question: Separation of singing voice from music. I don't need that complex audio processing; I only need a detection mechanism that detects that some voice/vocal is playing while the music is playing (or not playing). I need to extract the first 30 seconds from when a vocalist starts singing along with the full band music; see question 2 below. I want to create a music catalog using ASP.NET MVC 2, Silverlight clients and C#/.NET 4.0 that would be the front store. On the backend I would also like to create a desktop WPF/Windows application to create the music catalog from already existing music files, most of which have metadata in them: ID3v1, ID3v2.3, ID3v2.4, iTunes MP4, WMA, Vorbis Comments, APE tags etc. I would possibly like to create a web service that would allow catalog contributors to upload a zipped album and trigger extraction of the metadata and of the music segments described below. I would be happy if I achieve no. 1 below. Let's say I have thousands of songs in mp3 (or other formats) grouped in subfolders using some classification (genre, artist, album, composer or other groupings). I want to create tables in a DB that would organize songs so they can be searched based on different criteria (year, length, the above classification, or song title, description etc.) like what the iTunes Store allows its customers. I want to extract metadata from various formats (I will try to get songs in mp3 format, but there may be other popular formats) and allow a music catalog manager to add missing data from either the desktop or web application. He or other contributors can upload zipped music via an HTML, Silverlight or WPF upload. Can anybody suggest open source libraries, articles or code snippets that can do this in an automatic way using .NET and possibly SQL Server? My main questions are these. This is an audio processing challenge. I want to extract two segments of music (questions 1 and 2): 1. How do I extract a music segment from 1-2 seconds before a vocal starts singing up to 30 seconds from that point in time? 2. Much more challenging is to find repeating segments (one would usually find or recognize the names of the songs, and songs are usually known by these refrains). How would I go about creating a list of songs that go great together, like what Genius from iTunes does? Are there any characteristics of music that can be used to match songs? The goal is for people to quickly scan and recognize songs, i.e. associate melody and words with a title/album so they can make intelligent decisions like buying a song or creating similar-mood playlists. Thanks, Rad

  • The internal implementation of R's dataset

    - by Yin Zhu
    I am building a data processing program. Currently I use a double matrix to represent the data table: each row is an instance and each column represents a feature. I also have an extra vector as the target value for each instance; it is of double type for regression and of integer type for classification. I want to make it more general. I am wondering what kind of structure R uses to store a dataset, i.e. the internal implementation in R.
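
    For reference (editor's note): an R data.frame is essentially a generic vector (a list) of equal-length, homogeneously typed column vectors, plus names and row-names attributes. A minimal Python analogue of that column-oriented layout, with invented toy data:

        # One homogeneous vector per feature, plus a target column.
        columns = {
            "x1": [0.5, 1.2, 3.3],  # double feature
            "x2": [7.0, 2.5, 9.1],
            "y":  [0, 1, 1],        # integer target (classification)
        }

        n_rows = len(next(iter(columns.values())))
        instance = {name: col[1] for name, col in columns.items()}  # row 1
        print(n_rows, instance)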

  • Quick guide to Oracle IRM 11g: Creating your first sealed document

    - by Simon Thorpe
    The previous articles in this guide have detailed how to install, configure and secure your Oracle IRM 11g service. This article walks you through creating your first context and securing a document against it. I should mention that it would be worth reviewing the following to ensure your installation is ready for that all-important first document. Ensure you have correctly configured the keystore for the IRM wrapper keys; if this is not correctly configured, creating the context below will fail. Make sure the IRM server URL correctly resolves and uses the right protocol (HTTP or HTTPS). Contents: create the first context; install the Oracle IRM Desktop; seal your first document. Create the first context In Oracle IRM 11g there is a built-in classification and rights system called the "standard rights model", which is based on 10 years of customer use cases and innovation. It is a system which enables IRM to scale massively whilst retaining the ability to balance security and usability, and also to separate duties by allowing contacts in the business to own classifications. The final article in this guide goes into detail on this inbuilt classification model, but for the purposes of the current article all we need to do is create at least one context to test our system out. With a new IRM server there is a set of predefined context templates and roles which, again, are set up in a way which reflects the most common use we've learned from our customers. We will use these out-of-the-box configurations as they are to create the first context against which we will seal some content. First log in to your Oracle IRM Management Website located at https://irm.company.com/irm_rights/. Currently the system is only configured to use the built-in LDAP for users, so use the only account we have at the moment, which by default is weblogic. Once logged in, switch to the Contexts tab. Click on the New Context icon in the menu bar on the left. In the resulting dialog select the Standard context template and enter a name for the context. Then just hit Finish; the weblogic account will automatically be made the manager. You'll now see your brand new context ready for users to be assigned. Now click on the Assign Role icon in the menu bar and in the resulting dialog search for your only user account, weblogic, and add it to the list on the right. Now select a role for this user. Because we need to create a document with this user we must select Contributor, as this is the only role which allows sealing. Finally hit Next and then Finish. We now have a context with a user that has the rights to create a document. The next step is to configure the IRM Desktop to get these rights from the server. Install the Oracle IRM Desktop Before we can seal a document we need the client software installed. Oracle IRM has a very small, lightweight client called the Oracle IRM Desktop which can be freely downloaded in 27 languages from here. Double-click on the installer and click on Next... Next again... and finally on Install... Very easy. You may get a warning about closing Outlook, Word or another application, and most of the time no reboots are required. Once it is installed you will see the IRM Desktop icon running in your system tray, bottom right of the desktop. Seal your first document Finally the prize is within reach: creating your first sealed document.
    The server is running, we've got a context ready and a user assigned a role in the context, but there is one simple and obvious hoop left to jump through. To seal a document we need to have the user's rights cached to the local machine. For this to take place, the IRM Desktop needs to know where the Oracle IRM server is on the network so we can synchronize these rights and then be able to seal a document. The usual way for the IRM Desktop to learn about the IRM server is automatically, when you open an existing piece of content that someone has sent you... ack. Bit of a chicken-and-egg dilemma. The solution is to manually tell the IRM Desktop the location of the IRM server and then force a synchronization of rights. Right-click on the Oracle IRM Desktop icon in the system tray and select Options.... Then switch to the Servers tab in the resulting dialog. There are no servers in the list because you've never opened any content. This list is usually populated automatically, but we are going to add a server manually, so click on New.... In the dialog enter the full URL of the IRM server. Note that this time you use the path /irm_desktop/ and not /irm_rights/. Click on the Validate button and you'll be asked to authenticate. Enter your weblogic username and password and also check the Remember my password check box. Click OK and the IRM Desktop will confirm a successful connection to the server. OK all the dialogs and we are ready to synchronize this user's rights to the desktop. Right-click once more on the Oracle IRM Desktop icon in the system tray. Now the Synchronize menu option is available. Select this and the IRM Desktop will talk to the IRM server, authenticate using your weblogic account and get your rights to the context we created. Because this is the first time this user has communicated with the IRM server, the IRM Desktop presents a privacy policy dialog. This is a chance for the business to ask users to agree to any policy about the use of IRM before opening secured documents. In our guide we've not bothered to set up this URL, so just click on the check box and hit Accept. The IRM Desktop will then talk to the server, get your rights and display a success dialog. Let's protect a document Now we are ready to seal a piece of content. In my guide I'm going to protect a Microsoft Word document. This means I have to have a copy of Office installed; in this guide I'm using Microsoft Office 2007. You could also seal a PDF document; you'll need to download and install Adobe Acrobat Reader. A very simple test could be to seal a GIF/JPG/PNG or a piece of HTML, because this is rendered using Internet Explorer. But as I say, I'm going to protect a Word document. The following example demonstrates choosing a file in Windows Explorer; there are many ways to seal a file and you can watch a few in this video. Open a copy of Windows Explorer and locate the file you wish to seal. Right-click on the document and select Seal To -> Context. You are now presented with the Select Context dialog. You'll now have a sealed copy of the document sitting in the same location. Double-click on this document and it will open, again using the credentials you've already provided. That is it; now you just need to add more users, more documents and more classifications, and start exploring the different roles and experimenting with different offline periods etc.
    You may wish to set up the server against an existing LDAP or Active Directory environment instead of using the built-in WebLogic LDAP store. You can read how to use your corporate directory here. But before we finish this guide, there is one more article, and arguably the most important article of all. Next I discuss the all-important decision making surrounding the actual implementation of Oracle IRM inside your business. Who has rights to what? How do you map contexts to your existing business practices? It is the next article which actually ensures you deploy a successful IRM solution, by looking at the business and understanding how they use your sensitive information and then configuring Oracle IRM to reflect their use.

  • Probation is Over: PASS Board Year 1, Q2

    - by Denise McInerney
    Though it's not always official, every job begins with a probation period. You start out with lots of questions and every day you find out how much more you have to learn. Usually after a few months you discover that you can actually answer some questions and have at least an idea of what you are supposed to be doing. Now at the end of my second quarter on the "job" of serving on the PASS Board, I have reached that point. My probation period is over. The last three months were busy for the entire Board with the budget process, an in-person meeting and moving forward with PASS Global Growth plans. I had also set a specific goal for myself for my 2nd quarter: to see the Board adopt a Code of Conduct for the PASS Summit. Code of Conduct When I ran for the Board I included my desire to see PASS establish a code of conduct in my campaign platform. I was motivated to do this for a few reasons. Other technical conferences have had incidents of harassment. Most of these did not have a policy in place prior to having a problem, though several conference organizers have since adopted anti-harassment policies or codes of conduct. I felt it would be in PASS' interest to establish a policy so we would be prepared should there be an incident. "This is Community" Adopting a code of conduct would reinforce our community orientation and send a message about the positive character of the Summit. PASS is a leader among technical organizations for its promotion and support of women. Adopting a code of conduct would further demonstrate our leadership in this area. After researching similar policies from other organizations I published a first draft in April. I solicited feedback from the Board, HQ staff and some PASS members. Incorporating that feedback, I presented version 4 at the May Board meeting, where we had a good discussion. You can read the meeting minutes for details. I incorporated points from the Board discussion as well as feedback from a legal review to produce a final version which has been submitted to the Board. It will be discussed at the Board meeting July 12. You can read the full text at the end of this post. Virtual Chapters In the first quarter we started ramping up marketing support for the Virtual Chapters. Since then each edition of the Connector has highlighted a different VC to help get out the message about the variety of educational opportunities that are offered. These VC profiles will continue in the coming months. I was very pleased to welcome the new DBA Fundamentals VC, which is geared toward new DBAs, people who are considering entering the field and those transitioning from a different IT role. Thanks to the contributions of Erin Stellato, Michelle Nalliah and Karla Landrum, we published a "Virtual Chapter Guidebook". This document includes great advice on how to build and promote a VC. It's also a reference for how things work, from budgets to webinar hosting. I think this document will be extremely valuable to all our VC leaders and am grateful to those who put it together. Board Meeting/SQL Rally The Board met in May in Dallas. Among the items discussed were Global Growth, the budget, future events and the upcoming elections. We covered a lot of ground in two days and I will again refer you to the meeting minutes for details. The meeting schedule allowed us to participate in the SQL Rally networking events and one full day of the conference. I enjoyed having the opportunity to meet and talk with many PASS members.
    And my hat is off to the SQL Rally organizers, who put on an outstanding event. Global Growth PASS has undertaken a major initiative to reach and engage SQL Server professionals around the world. This Global Growth plan is ambitious and will have a significant impact on the strategic direction of the organization. We have been reaching out to the community for feedback, including hosting Twitter chats and live Town Hall meetings. I co-hosted two of these events and appreciated hearing the different perspectives of the people who participated. If you have not done so, I encourage you to read about the Global Growth vision and proposed governance changes and submit your feedback. FY13 Budget July 1 is the beginning of PASS' fiscal year, which makes the end of June the deadline for approving a budget. Each director submits a budget for his or her portfolio. For the Virtual Chapter portfolio I focused on how we can allocate resources to grow the VCs. Budgeting is a give-and-take process, and while I didn't get everything I asked for, I'm pleased the FY13 budget includes a significant increase in financial support for the Virtual Chapters. Many people put a lot of work into the budget, but no two people deserve credit more than VP of Finance Douglas McDowell and Accounting Manager Sandy Cherry. Thanks to both of them for getting us across the goal line on time. SQL Saturday I attended SQL Saturdays in Orange Co. CA and Phoenix. It's always inspiring to see the enthusiasm in the community for learning and networking. These events are successful due to the hard work of many volunteers. Thanks to the organizers in both cities for all your efforts. Next Up This quarter we'll be gearing up plans for the VCs at the Summit and exploring ways the VCs can best support PASS' Global Growth work. I'll also be wrapping up work on the Code of Conduct and attending a Board meeting in September. And I will be at SQL Saturday #144 in Sacramento later this month. Here is the language of the Code of Conduct I have submitted to the Board for consideration: PASS Code of Conduct The PASS Summit provides database professionals from a variety of backgrounds with an opportunity to connect, share and learn. We value the strong sense of community that characterizes this event and we seek to foster an inclusive, professional atmosphere. We are dedicated to providing a harassment-free conference experience for everyone, regardless of gender, race, sexual orientation, disability, physical appearance, religion or any other protected classification. Everyone at the Summit is expected to follow the Code of Conduct. This includes but is not limited to: PASS Staff, Exhibitors, Speakers, Attendees and anyone affiliated with the event. Participants are expected to follow the Code of Conduct at all Summit events, including PASS-sponsored social events. Participant behavior Harassment includes, but is not limited to, offensive verbal comments related to gender, race, sexual orientation, disability, physical appearance, religion, or any other protected classification. Intimidation, threats, stalking, harassing photography or recording, sustained disruption of talks or other events, inappropriate physical contact and unwelcome attention will also be considered harassment. Similarly, sexual, racist, derogatory, threatening or other inappropriate language and imagery are not appropriate for any conference venue, including sessions.
Recourse If a participant engages in any conduct that is prohibited under this Code of Conduct, the conference organizers may take any action they deem appropriate, including warning the offender or expelling the offender from the conference. No refunds will be granted to attendees expelled from the Summit due to violations of the Code of Conduct. If you are being harassed, witness harassment, or have any other concerns, please contact a member of conference staff immediately. Conference staff can be identified by their “Headquarters/Staff” shirts and are trained to handle the situation appropriately. A Code of Conduct Committee (CCC) made up of the Executive Manager and three members of the Board of Directors designated by the President will be authorized to take action in response to an incident or behavior that violates the Code of Conduct.

  • Rsyslog mail module not working

    - by Henry-Nicolas Tourneur
    Hi *, I would like to email Snort alerts from my Debian Lenny firewall. Syslog is sending log messages from the firewalls to a central rsyslog. On my central rsyslog, I have something like:

        $ModLoad ommail
        $ActionMailSMTPServer server.company.local
        $ActionMailFrom [email protected]
        $ActionMailTo [email protected]
        $ActionExecOnlyOnceEveryInterval 1
        $template mailSubject,"[SNORT] Alert from %hostname%"
        $template mailBody,"Snort message\r\nmsg='%msg%'"
        $ActionMailSubject mailSubject
        if $msg regexp 'snort[[0-9]]: [[0-9]:[0-9]:[0-9]].*' then ommail:;mailBody

    But I don't get any mails. I can even trigger Snort with something like ping -s 1400; it logs things like the following, but still no mail!

        2010-01-08T09:25:58+00:00 Hostname snort[4429]: [1:499:4] ICMP Large ICMP Packet [Classification: Potentially Bad Traffic] [Priority: 2]: {ICMP} ip_dest - ip_src

    Any idea?

  • Tomato QoS: Why is some traffic unclassified when there are classifications for it?

    - by Armitage
    OK, I am trying to tweak my router to give priority to some traffic. My classifications seem to cover just about everything, but I still see ~60 to ~80% of the traffic as unclassified:

        TCP 192.168.1.100 64137 192.168.1.1 80 Unclassified
        TCP 192.168.1.100 64175 192.168.1.1 80 Unclassified
        TCP 192.168.1.100 64144 192.168.1.1 443 Unclassified

    I assume that the 64### ports are just what my WAP uses to send packets inside my home network. But my classifications seem to cover any traffic for destination ports 80 and 443 (partial list):

        TCP Dst Port: 80,443 High WWW
        TCP/UDP Dst Port: 1024-65535 Lowest Bulk Traffic

    Why do I have so much unclassified traffic if I have a classification that should cover it?

  • What is good RAM for my CPU?

    - by Jason94
    I'm going to upgrade my RAM, but I have no clue what to look for. There is cheap RAM and there is expensive RAM. What should I look for? What are good attributes for an Intel i5? The i5 CPU and motherboard support 1600 MHz (PC3-12800), but there is still too much to choose from. There are: CL10 (10-10-10-27) @ 1.5V, CL11 @ 1.35V, CL9 (9-9-9-24) @ 1.5V, and CL9 (9-9-9-24) @ 1.65V. Is a higher CL better? Is CL a classification? And I guess a high voltage is also good, since that's what overclockers do to the RAM (increase the voltage)?
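
    An editorial aside on the arithmetic behind CL (not part of the original question): CL is the CAS latency counted in memory-clock cycles. DDR3-1600 transfers 1600 MT/s on an 800 MHz clock, so one cycle is 2000/1600 = 1.25 ns; CL9 is therefore 9 x 1.25 = 11.25 ns, CL10 is 12.5 ns and CL11 is 13.75 ns. At the same speed, a lower CL means lower absolute latency.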

  • When should I use Areas in TFS instead of Team Projects

    - by Martin Hinshelwood
    Well, it depends…. If you are a small company that creates a finite number of internal projects then you will find it easier to create a single project for each of your products and have TFS do the heavy lifting with reporting, SharePoint sites and Version Control. But what if you are not… Update 9th March 2010 Michael Fourie gave me some feedback which I have integrated. Ed Blankenship via @edblankenship offered encouragement and a nice quote. Ewald Hofman gave me a couple of Cons, and maybe a few more soon. Ewald’s company, Avanade, currently uses Areas, but it looks like the manual management is getting too much and the project is getting cluttered. What if you are likely to have hundreds of projects, possibly with a multitude of internal and external projects? You might have 1 project for a customer or 10. This is the situation that most consultancies find themselves in and thus they need a more sustainable and maintainable option. What I am advocating is that we should have 1 “Team Project” per customer, and use areas to create “sub projects” within that single “Team Project”. "What you describe is what we generally do internally and what we recommend. We make very heavy use of area path to categorize the work within a larger project." - Brian Harry, Microsoft Technical Fellow & Product Unit Manager for Team Foundation Server   "We tend to use areas to segregate multiple projects in the same team project and it works well." - Tiago Pascoal, Visual Studio ALM MVP   "In general, I believe this approach provides consistency [to multi-product engagements] and lowers the administration and maintenance costs. All good." - Michael Fourie, Visual Studio ALM MVP   “@MrHinsh BTW, I'm very much a fan of very large, if not huge, team projects in TFS. Just FYI :) Use Areas & Iterations.” Ed Blankenship, Visual Studio ALM MVP   This would mean that SSW would have a single Team Project called “SSW” that contains all of our internal projects and consequently all of the Areas and Iteration move down one hierarchy to accommodate this. Where we would have had “\SSW\Sprint 1” we now have “\SSW\SqlDeploy\Sprint1” with “SqlDeploy” being our internal project. At the moment SSW has over 70 internal projects and more than 170 total projects in TFS. This method has long term benefits that help to simplify the support model for companies that often have limited internal support time and many projects. But, there are implications as TFS does not provide this model “out-of-the-box”. These implications stretch across Areas, Iterations, Queries, Project Portal and Version Control. Michael made a good comment, he said: I agree with your approach, assuming that in a multi-product engagement with a client, they are happy to adopt the same process template across all products. If they are not, then it’ll either be easy to convince them or there is a valid reason for having a different template - Michael Fourie, Visual Studio ALM MVP   At SSW we have a standard template that we use and this is applied across the board, to all of our projects. We even apply any changes to the core process template to all of our existing projects as well. If you have multiple projects for the same clients on multiple templates and you want to keep it that way, then this approach will not work for you. However, if you want to standardise as we have at SSW then this approach may benefit you as well. Implications around Areas Areas should be used for topological classification/isolation of work items. 
    You can think of this as architecture areas, organisational areas or even the main features of your application. In our scenario there is an additional top-level item that represents the Project / Product that we want to chop our Team Project into. Figure: Creating a sub area to represent a product/project is easy.

    <teamproject>
    <teamproject>\<Functional Area/module whatever>

    Becomes:

    <teamproject>
    <teamproject>\<ProjectName>\
    <teamproject>\<ProjectName>\<Functional Area/module whatever>

    Implications around Iterations Iterations should be used for chronological classification/isolation of work items. This could include isolated time boxes, milestones or release timelines, and really depends on the logical flow of your project or projects. Due to the new level in Area we need to add the same level to Iteration. This is primarily because it is unlikely that the sprints in each of your projects/products will start and end at the same time. This is just a reality of managing multiple projects. Figure: Adding the same Area value to Iteration as the top level item adds flexibility to Iteration.

    <teamproject>\Sprint 1
    Or
    <teamproject>\Release 1\Sprint 1

    Becomes:

    <teamproject>\<ProjectName>\Sprint 1
    Or
    <teamproject>\<ProjectName>\Release 1\Sprint 1

    Implications around Queries Queries are used to filter your work items based on a specified level of granularity. There are a number of queries that are built into a project created using the MSF Agile 5.0 template, but we now have multiple projects and it would be a pain to have to edit all of the work items every time we changed project; that would also only allow one team to work on one project at a time. Figure: The Queries that are created in a normal MSF Agile 5.0 project do not quite suit our new needs. In order for project contributors to be able to query based on their project we need a couple of things. The first thing I did was to create an “_Area Template” folder that has a copy of the project layout with all the queries set up to filter based on the “_Area Template” Area and the “_Sprint template” you can see in the Area and Iteration views. Figure: The template is currently easy to drag and drop, but you then need to edit the queries to point at the right Area and Iteration. This needs a tool. I then created an “Areas” folder to hold all of the area-specific queries. So, when you go to create a new TFS sub-project you just drag “_Area Template” while holding “Ctrl” and drop it onto “Areas”. There is a little setup here. That said, I managed it in around 10 minutes, which is not so bad, and I can imagine it being quite easy to build a tool to create these queries. Figure: These new queries can be configured in around 10 minutes, which includes setting up the Area and Iteration as well. Version Control What about your source code? Well, that is the easiest of the lot. Just create a sub folder for each of your projects/products. Figure: Creating sub folders in source control is as easy as “Right click | Create new folder”.

    <teamproject>\DEV\Main\

    Becomes:

    <teamproject>\<ProjectName>\DEV\Main\

    Conclusion I think it is up to each company to make a call on how you want to configure your Team Projects; it depends completely on how many projects/products you are going to have for each customer, including yourself. If we decide to take this route it will require some configuration to get our 170+ projects into this format, and I will probably be writing some tools to help.
    Pros:
    - You only have one project to upgrade when a process template changes – after going through an upgrade of over 170 projects prior to the changes in the RC, I can tell you that that many projects is no fun.
    - Standardises your Process Template – you will always have the same process implementation across projects/products without exception.
    - You get tighter control over the permissions – yes, you can do this on a standard Team Project, but it gets a lot easier with practice.
    - You can “move” work items from one “product” to another – have we not always wanted to do that?
    - You can rename your projects – wahoo: everyone wants to do this, now you can.
    - One set of Reporting Services reports to manage – you set an area and iteration to run reports anyway, so you may as well set both.
    - Simplified check-in policies – there is only one set of check-in policies per client. This simplifies administration of policies.
    - Simplified alerts – as alerts are applied across multiple projects, this simplifies your alert rules per client.

    Cons: all of these could be mitigated by a custom tool that helps automate creation of “sub-projects” within Team Projects. This custom tool could create areas, iterations, permissions, SharePoint sites and queries. It just does not exist yet :)
    - You need to configure the Areas and Iterations.
    - You need to configure the permissions.
    - You may need to configure sub sites for SharePoint (depends on your requirements) – if you have two projects/products in the same Team Project then you will not see the burndown for each one out-of-the-box, but rather a cumulative for the Team Project. This is not really that much of a problem, as you would have to configure your burndown graphs for your current iteration anyway. Note: when you create a sub site to a TFS linked portal it will inherit the settings of its parent site :) This is fantastic, as it means that you can easily create sub sites and then set the Area and Iteration path in each of the reports to be the correct one.
    - Every team wants their own customization (via Ewald Hofman) – small teams of 2 persons against teams of 30, or even outsourcing, need their own process; you cannot allow that, because everybody gets the same work item types. Note: luckily at SSW this is not a problem, as our template is standardised across all projects and customers.
    - Large list of builds (via Ewald Hofman) – as the build list in Team Explorer is just a flat list, it can get very cluttered. Note: I would mitigate this by removing any build that has not been run in over 30 days. The build template and workflow will still be available in version control, but it will clean the list.

    Feedback Now that I have explained this method, what do you think? What other pros and cons can you see? What do you think of this approach? Will you be using it? What tools would you like to support you?

    Technorati Tags: Visual Studio ALM, TFS Administration, TFS, Team Foundation Server, Project Planning, TFS Customisation

  • C# Neural Networks with Encog

    - by JoshReuben
    Neural Networks ·       I recently read the book Introduction to Neural Networks for C#, by Jeff Heaton. http://www.amazon.com/Introduction-Neural-Networks-C-2nd/dp/1604390093/ref=sr_1_2?ie=UTF8&s=books&qid=1296821004&sr=8-2-spell. Not the 1st ANN book I've perused, but a nice revision. ·       Artificial Neural Networks (ANNs) are a mechanism of machine learning – see http://en.wikipedia.org/wiki/Artificial_neural_network , http://en.wikipedia.org/wiki/Category:Machine_learning ·       Problems Not Suited to a Neural Network Solution - programs that are easily written out as flowcharts consisting of well-defined steps, program logic that is unlikely to change, problems in which you must know exactly how the solution was derived. ·       Problems Suited to a Neural Network – pattern recognition, classification, series prediction, and data mining. Pattern recognition - the network attempts to determine if the input data matches a pattern that it has been trained to recognize. Classification - take input samples and classify them into fuzzy groups. ·       As far as machine learning approaches go, I think SVMs are superior (see http://en.wikipedia.org/wiki/Support_vector_machine ) - a neural network has certain disadvantages in comparison: an ANN can be overtrained, different training sets can produce non-deterministic weights, and it is not possible to discern the underlying decision function of an ANN from its weight matrix – they are a black box. ·       In this post, I'm not going to go into internals (believe me, I know them). An autoassociative network (e.g. a Hopfield network) will echo back a pattern if it is recognized. ·       Under the hood, there is very little maths. In a nutshell - some simple matrix operations occur during training: the input array is processed (normalized into bipolar values of 1, -1) and transposed from an input column vector into a row vector; these are subject to matrix multiplication and then subtraction of the identity matrix to get a contribution matrix. The dot product is taken against the weight matrix to yield a boolean match result. For backpropagation training, a derivative function is required. In learning, hill climbing mechanisms such as Genetic Algorithms and Simulated Annealing are used to escape local minima. For unsupervised training, such as found in Self Organizing Maps used for OCR, Hebb's rule is applied. ·       The purpose of this post is not to mire you in technical and conceptual details, but to show you how to leverage neural networks via an abstraction API - Encog   Encog ·       Encog is a neural network API ·       Links to Encog: http://www.encog.org , http://www.heatonresearch.com/encog, http://www.heatonresearch.com/forum ·       Encog requires .Net 3.5 or higher – there is also a Silverlight version. Third-Party Libraries – log4net and nunit. ·       Encog supports feedforward, recurrent, self-organizing map, radial basis function and Hopfield neural networks. ·       Encog neural networks, and related data, can be stored in .EG XML files. ·       Encog Workbench allows you to edit, train and visualize neural networks. The Encog Workbench can generate code. Synapses and layers ·       the primary building blocks - almost every neural network will have, at a minimum, an input and an output layer. In some cases, the same layer will function as both input and output layer.
Encog
·       Encog is a neural network API.
·       Links to Encog: http://www.encog.org , http://www.heatonresearch.com/encog , http://www.heatonresearch.com/forum
·       Encog requires .NET 3.5 or higher – there is also a Silverlight version. Third-party libraries: log4net and NUnit.
·       Encog supports feedforward, recurrent, self-organizing map, radial basis function and Hopfield neural networks.
·       Encog neural networks, and related data, can be stored in .EG XML files.
·       The Encog Workbench allows you to edit, train and visualize neural networks. The Workbench can also generate code.
Synapses and layers
·       These are the primary building blocks. Almost every neural network will have, at a minimum, an input and an output layer. In some cases, the same layer will function as both input and output layer.
·       To adapt a problem to a neural network, you must determine how to feed the problem into the input layer of a neural network, and how to receive the solution through the output layer.
·       The input layer – for each input neuron, one double value is stored. An array is passed as input to a layer. Encog uses the interface INeuralData to hold these arrays; the class BasicNeuralData implements the INeuralData interface. Once the neural network processes the input, an INeuralData-based class will be returned from the neural network's output layer.
·       To convert a double array into an INeuralData object: INeuralData data = new BasicNeuralData(new double[10]);
·       The output layer – the neural network outputs an array of doubles, wrapped in a class based on the INeuralData interface.
·       The real power of a neural network comes from its pattern recognition capabilities. The neural network should be able to produce the desired output even if the input has been slightly distorted.
·       Hidden layers – optional, sitting between the input and output layers; very much a "black box". If the structure of the hidden layer is too simple, it may not learn the problem. If the structure is too complex, it will learn the problem but will be very slow to train and execute. Some neural networks have no hidden layers, with the input layer directly connected to the output layer. Further, some neural networks have only a single layer, which is self-connected.
·       Connections, called synapses, contain individual weight matrices. These values are changed as the neural network learns.
Constructing a Neural Network
·       The XOR operator is a frequent "first example" – the "Hello World" application for neural networks.
·       The XOR operator only returns true when its two inputs differ:
0 XOR 0 = 0
1 XOR 0 = 1
0 XOR 1 = 1
1 XOR 1 = 0
·       Structuring a neural network for XOR: two inputs to the XOR operator and one output.
·       Input: 0.0,0.0  1.0,0.0  0.0,1.0  1.0,1.0
·       Expected output: 0.0  1.0  1.0  0.0
·       A perceptron – a simple feedforward neural network – can learn the XOR operator. Because the XOR operator has two inputs and one output, the neural network follows suit. Additionally, the neural network has a single hidden layer, with two neurons to help process the data. The choice of 2 neurons in the hidden layer is arbitrary, and often comes down to trial and error.
·       (Figure: neuron diagram for the XOR network.) The Encog Workbench displays neural networks on a layer-by-layer basis. (Figure: Encog layer diagram for the XOR network.)
·       Create a BasicNetwork – three layers are added to this network. The FinalizeStructure method must be called to inform the network that no more layers are to be added. The call to Reset randomizes the weights in the connections between these layers.
var network = new BasicNetwork();
network.AddLayer(new BasicLayer(2));
network.AddLayer(new BasicLayer(2));
network.AddLayer(new BasicLayer(1));
network.Structure.FinalizeStructure();
network.Reset();
·       Neural networks frequently start with a random weight matrix. This provides a starting point for the training methods. These random values will be tested and refined into an acceptable solution. However, sometimes the initial random values are too far off, and it may be necessary to reset the weights again if training is ineffective. These weights make up the long-term memory of the neural network. Additionally, some layers have threshold values that also contribute to the long-term memory, and some neural networks contain context layers, which give the network a short-term memory as well. The neural network learns by modifying these weight and threshold values.
·       Now that the neural network has been created, it must be trained.
Training a Neural Network
·       Construct an INeuralDataSet object – it contains the input array and the expected output array (of corresponding size). Even though there is only one output value, we must still use a two-dimensional array to represent the output.
public static double[][] XOR_INPUT = {
    new double[2] { 0.0, 0.0 },
    new double[2] { 1.0, 0.0 },
    new double[2] { 0.0, 1.0 },
    new double[2] { 1.0, 1.0 } };

public static double[][] XOR_IDEAL = {
    new double[1] { 0.0 },
    new double[1] { 1.0 },
    new double[1] { 1.0 },
    new double[1] { 0.0 } };

INeuralDataSet trainingSet = new BasicNeuralDataSet(XOR_INPUT, XOR_IDEAL);
·       Training is the process where the neural network's weights are adjusted to better produce the expected output. Training will continue for many iterations, until the error rate of the network is below an acceptable level. Encog supports many different types of training; Resilient Propagation (RPROP) is a good general-purpose training algorithm. All training classes implement the ITrain interface; the RPROP algorithm is implemented by the ResilientPropagation class. Training the neural network involves calling the Iteration method on the ITrain class until the error is below a specific value. The code below loops through as many iterations, or epochs, as it takes (up to 10,000) to get the error rate for the neural network below 1%. Once the neural network has been trained, it is ready for use.
ITrain train = new ResilientPropagation(network, trainingSet);

for (int epoch = 0; epoch < 10000; epoch++)
{
    train.Iteration();
    Debug.Print("Epoch #" + epoch + " Error:" + train.Error);
    if (train.Error < 0.01) break; // stop once the error drops below 1%
}
Executing a Neural Network
·       Call the Compute method on the BasicNetwork class.
Console.WriteLine("Neural Network Results:");
foreach (INeuralDataPair pair in trainingSet)
{
    INeuralData output = network.Compute(pair.Input);
    Console.WriteLine(pair.Input[0] + "," + pair.Input[1] +
        ", actual=" + output[0] + ", ideal=" + pair.Ideal[0]);
}
·       The Compute method accepts an INeuralData object and also returns an INeuralData object:
Neural Network Results:
0.0,0.0, actual=0.002782538818034049, ideal=0.0
1.0,0.0, actual=0.9903741937121177, ideal=1.0
0.0,1.0, actual=0.9836807956566187, ideal=1.0
1.0,1.0, actual=0.0011646072586172778, ideal=0.0
·       The network has not been trained to give exact results. This is normal: because the network was trained to 1% error, each of the results will generally be within 1% of the expected value.
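For convenience, here are the snippets above assembled into a single console program. This is a sketch built only from the types shown in this post; the using directives are my assumption for Encog 2.x and may need adjusting for other Encog versions.

using System;
// NOTE: the namespace layout below is a best guess for Encog 2.x -
// adjust these using directives to match your Encog version.
using Encog.Neural.Data;
using Encog.Neural.Data.Basic;
using Encog.Neural.Networks;
using Encog.Neural.Networks.Layers;
using Encog.Neural.Networks.Training;
using Encog.Neural.Networks.Training.Propagation.Resilient;

public static class XorDemo
{
    public static double[][] XOR_INPUT = {
        new double[2] { 0.0, 0.0 },
        new double[2] { 1.0, 0.0 },
        new double[2] { 0.0, 1.0 },
        new double[2] { 1.0, 1.0 } };

    public static double[][] XOR_IDEAL = {
        new double[1] { 0.0 },
        new double[1] { 1.0 },
        new double[1] { 1.0 },
        new double[1] { 0.0 } };

    public static void Main()
    {
        // 2 input neurons, a hidden layer of 2, and 1 output neuron.
        var network = new BasicNetwork();
        network.AddLayer(new BasicLayer(2));
        network.AddLayer(new BasicLayer(2));
        network.AddLayer(new BasicLayer(1));
        network.Structure.FinalizeStructure();
        network.Reset(); // randomize the initial weight matrix

        INeuralDataSet trainingSet = new BasicNeuralDataSet(XOR_INPUT, XOR_IDEAL);

        // Train with resilient propagation until the error drops below 1%,
        // capped at 10,000 epochs in case training does not converge.
        ITrain train = new ResilientPropagation(network, trainingSet);
        for (int epoch = 0; epoch < 10000; epoch++)
        {
            train.Iteration();
            if (train.Error < 0.01) break;
        }

        // Execute the trained network against the training inputs.
        Console.WriteLine("Neural Network Results:");
        foreach (INeuralDataPair pair in trainingSet)
        {
            INeuralData output = network.Compute(pair.Input);
            Console.WriteLine(pair.Input[0] + "," + pair.Input[1] +
                ", actual=" + output[0] + ", ideal=" + pair.Ideal[0]);
        }
    }
}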

    Read the article

  • CodePlex Daily Summary for Thursday, May 13, 2010

CodePlex Daily Summary for Thursday, May 13, 2010

New Projects

- C# 4.0: VS 2010
- Collaborative Rich Text Edit Prototype Silverlight on Windows Azure: Works like Google Docs, but based on Silverlight/Windows Azure. Multiple users can edit one rich text document simultaneously. This is a POC and my...
- Consulting Assigment: Consulting project for the Project Management course at UCR.
- CS 105 MP Splitter: Simple program to help with splitting up MPs for grading.
- DipEngine - graphic engine written on Xen for XNA: DIPEngine makes it easier for developers to use in their games. Developed in C# using Xen for XNA.
- DotNetAgent: Agent framework.
- Extending C# editor - Outlining, classification: Extensions to the VS C# editor's current feature set. Outlining for switch, for(each), if, etc. statements. Classification types for method parameters and...
- Floe IRC Client: Floe is an IRC client written entirely in C#, using the .NET 4.0 framework with WPF technology. It is inspired largely by both XChat and mIRC, and...
- Gestalt.Net: Gestalt.Net is a simple-to-use library for application configuration. It is more flexible, easier to use, and more powerful than the app...
- GIM.OnlineChess: An implementation of online long-term chess.
- GoogleTrail: Create trek trails using Google Maps and export those trails as Garmin Custom Map compatible maps.
- Grassidi: What could Grassidi be?
- HodsAudio: Personal web-based music library.
- Hospital Manage System: Free hospital management system.
- Html Reader: Html Reader makes it easier for WPF WebBrowser users to access the Document property and retrieve information from HTML pages. You'll no longer ha...
- Intellitouch.BackOffice: This project is a control for a menu.
- Jank: Jank
- KooBoo Image Galery: A module for kooboo that implements an image gallery admin and view. Developed for kooboo 2.1.1.0.
- MoonyDesk (windows desktop widgets): Plugin-based Windows desktop widget system written in WPF.
- MSNSharp Application with Video Conferencing Feature: In the MSN world, users don't communicate directly with each other. I examined open-source P2P video/audio conference systems. After researching, I found Vi...
- mysample: Study sample site.
- NoteFlyBot: An intelligent note program: a 'Post-it' note that accepts natural language commands to upload content to EMC repositories, start workflow...
- OpenGraphNET: OpenGraphNET is a simple parser for the Open Graph protocol created by Facebook. More information on this can be found at opengraphprotocol.org.
- POCO - ADO.NET Entity Framework 4.0: The sample shows how to retrieve data using POCO with EDM 4.0. Database used: Northwind (http://www.microsoft.com/downloads/details.aspx?Fa...
- sqlmx: sqlmx
- Upgrade Log Parser for SharePoint 2010: A simple upgrade log parser for SharePoint 2010. It will add a node in the upgrade section of central administration and deploy a pag...

New Releases

- AMP (Adamo Media Player): AMP (Adamo Media Player): First release, version 0.1.0. Installation instructions: unzip the package, run Setup.exe, follow the setup instructions. File includes .NET Framework...
- BeanProxy: BeanProxy 2.8: BeanProxy is a C# (.NET 3.5) library housing classes that facilitate unit testing. Any non-static, public interface/class or abstract class can be...
- Begtostudy-Test: Test V1: Download to test.
- CBM-Command: 2010-05-13: Release notes – 2010-05-13. New features: Configuration Manager is complete. Changes: had to put CBM-Command on a big-time diet. Ran out of room on the...
- CS 105 MP Splitter: CS 105 MP Splitter: Uses SharpZipLib for unzipping (http://www.icsharpcode.net/OpenSource/SharpZipLib/). Uses ILMerge to merge it all into a single executable (http://w...
- DipEngine - graphic engine written on Xen for XNA: DipGUI: Provides simple controls for use.
- Event Scavenger: Collector service update - version 3.2.3: Added functionality to check if the host/machine is available on the network before trying to open the event log. Also added code to try to use...
- ExcelExportLib: ExcelExportLib 1.4.0.0: Solution converted to Visual Studio 2010. Added support for formulas in cells, for cell naming, and for cell comments...
- F# Project Extender: V0.9.1.0 (VS2008, VS2010): F# project extender for Visual Studio 2008 and Visual Studio 2010. What's new: now supports both Visual Studio 2010 and Visual Studio 2008. Fixed...
- Floe IRC Client: Floe IRC Client 2010-05: Initial release. Note: the .NET Framework v4.0 is required to run this app. It can be downloaded here: http://www.microsoft.com/downloads/details....
- Fluent Assertions: Release 1.2.1: A small release with two improvements: you can now use the Enumeration() extension method on a Func<IEnumerable> before calling ShouldThrow<T...
- Gestalt.Net: Gestalt.Net 1.0: Initial release of Gestalt.Net, a simple-to-use library for application configuration. It is more flexible, easier to use, and more powe...
- GPdotNET - Tree Based Genetic Programming Tool: GPdotNETv0.9: The same version as the previous one, but compiled with Visual Studio 2010 and .NET 4.0. You no longer need the CTP of ParallelFX.
- HKGolden Express: HKGoldenExpress (Build 201005122025): New features: (none). Bug fixes: (none). Improvements: HKGolden Express now parses JSON data rather than XML documents from the HKGolden API. This shoul...
- Jobping Url Shortener: Deploy Code 0.3: Contains only the files for running version 0.3 of the code.
- Jobping Url Shortener: Source Code 0.3: Feature added: restriction placed on the domains that the shortener will shorten. Our installation will only shorten www.jobping.com urls. Bug fix...
- KooBoo Image Galery: Beta 1: This is the first release, divided into parts: Module Install – the admin module and a default view; Plugin – returns the gallery as...
- LinkedIn® for Windows Mobile: LinkedIn for Windows Mobile v0.8.5: Changes in release 0.8.5: allow for multiple update types; add search by company; show profiles of all people in an update.
- MDownloader: MDownloader-0.15.13.58780: Fixed Uploading.com dead link detection; fixed link detection on pages and in FilesTube results; fixed RSS channel checking; enabled by def...
- Microsoft - Domain Oriented N-Layered .NET 4.0 App Sample (Microsoft Spain): V0.8 - N-Layer DDD Sample App (VS.2010 RTM compat): Required software (Microsoft base software needed for the development environment): Visual Studio 2010 RTM & .NET 4.0 RTM (final versions), Unity Applic...
- MoonyDesk (windows desktop widgets): MoonyDesk: MoonyDesk alpha release.
- MSNSharp Application with Video Conferencing Feature: MSNVideoChat: The MSN video chat application's exe and screenshot are attached.
- NazTek.Extension.Clr35: NazTek.Extension.Clr35 Binary: FxCop-compliant codebase.
- NazTek.Extension.Clr35: NazTek.Extension.Clr35 Signed Binary: Signed binary.
- NazTek.Extension.Clr35: NazTek.Extension.Clr35 Signed Source: Signed source.
- NazTek.Extension.Clr35: NazTek.Extension.Clr35 Source: FxCop-compliant codebase.
- patterns & practices Web Client Developer Guidance: Web Client Software Factory 2010 Guide: If the right-side pane of the chm file is not displayed correctly, do the following: 1) download the WCSF2010Guide.chm file; 2) start the Windows Explo...
- Powershell Scripts for Admins: BizTalk PowerShell Module: BizTalk PowerShell module. Available commands: Get-Applications, Start-Application, Stop-Application, Get-HostInstances, Start-HostInstance, Stop-Hos...
- Reusable Library: V1.0.8: A collection of reusable abstractions for enterprise application developers.
- Reusable Library Demo: V1.0.6: A demonstration of reusable abstractions for enterprise application developers.
- Robot Shootans: Robot Shootans 0.9: The second public release of the game. All instructions for play are in the game itself. Known issues: bullet collision is still a little...
- Rx Contrib: V1.2: Bug fix. API changes: public static ReactiveQueue<T> CreateConcurrentQueue(ConcurrentPublicationBehavior concurrentPublicationBeha...
- SqlDiffFramework - A Visual Differencing Engine for Dissimilar Data Sources: SqlDiffFramework 1.0.1.0: Maintenance release. An embedded user control inadvertently assumed US regional and language settings; with non-US settings the application crashed...
- Transcriber: Transcriber V0.5.0: Pre-release.
- Upgrade Log Parser for SharePoint 2010: Upgrade Log Parser for SharePoint 2010: Introduction: I did an upgrade from MOSS 2007 to SharePoint 2010, and SP2010 generated such a large log file that I really didn't feel like scrolling...
- VCC: Latest build, v2.1.30512.0: Automatic drop of latest build.
- Visual Leak Detector for Visual C++ 2008/2010: v2.0a: Renamed vld dll files. Problem with the MSVC 2010 Unicode library fixed.
- VsTortoise - a TortoiseSVN add-in for Microsoft Visual Studio: VsTortoise Build 24 Beta: Build 24 (beta). New: added "Open Modified File..." to Solution Explorer context menus. VsTortoise.SolutionExplorerSelectedItemsOpenModifiedFile com...
- WatchersNET.TagCloud: WatchersNET.TagCloud 01.05.00: What's new: custom tags – tag name and tag URL can be localized; new option to set the start date to grab search referrals from; new option to choose th...
- Wavelet analysis: Wavelet analysis: First public version of the project.
- XNA Shooter Engine: ModelViewer Alpha 1 Preview: ModelViewer is a utility for previewing HLSL shaders and XNA BasicEffects using the XSE rendering engine. It can also be used with the Microsoft XN...
- Yet another developer blog - Examples: XML and XSLT transformation in ASP.NET MVC example: This sample application shows how to perform XSL transformation of XML files in ASP.NET MVC (using dedicated HtmlHelper extensions or an ActionResult)....

Most Popular Projects

- WBFS Manager
- Rawr
- AJAX Control Toolkit
- Microsoft SQL Server Product Samples: Database
- Silverlight Toolkit
- Windows Presentation Foundation (WPF)
- patterns & practices – Enterprise Library
- Microsoft SQL Server Community & Samples
- PHPExcel
- ASP.NET

Most Active Projects

- patterns & practices – Enterprise Library
- Mirror Testing System
- Rawr
- BlogEngine.NET
- PHPExcel
- white
- jQuery Library for SharePoint Web Services
- Microsoft Biology Foundation
- Farseer Physics Engine
- Shake - C# Make

    Read the article

< Previous Page | 2 3 4 5 6 7 8 9 10  | Next Page >