Search Results

Search found 11922 results on 477 pages for 'odi solutions and news'.

Page 98 of 477

  • Problems updating a TextBox in ASP.NET

    - by Roger Filipe
    Hello, I'm starting out in ASP.NET and am having a problem that I don't understand. I am building a news site. Every news item has a title and a body. I have a page where I can insert news items; it uses a TextBox for each of the fields (title and body), and after clicking the submit button everything works and the values are saved in the database. I also have a page where I can read the news; there I use a Label for each of the fields, set in Page_Load. Now I'm having problems on the page where I edit the news. I load the two TextBoxes (title and body) in Page_Load, so far so good, but when I change the text and click the submit button, it ignores the changes I made and saves the text that was loaded in Page_Load. This code doesn't show any database connection, but it illustrates the problem:

        protected void Page_Load(object sender, EventArgs e)
        {
            textboxTitle.Text = "This is the title of the news";
            textboxBody.Text = "This is the body of the news";
        }

    I load the page, make changes to the text, and then click submit:

        protected void btnSubmit_Click(object sender, EventArgs e)
        {
            String title = textboxTitle.Text;
            String body = textboxBody.Text;
            Response.Write("Title: " + title + " || ");
            Response.Write("Body: " + body);
        }

    Nothing happens; the text in the TextBoxes is always the one I loaded in Page_Load. How do I update the text in the TextBoxes?
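
    The usual cause of this in ASP.NET Web Forms is that Page_Load runs on every request, including postbacks, and it runs before the button's click handler, so the hard-coded assignments overwrite the user's edits before btnSubmit_Click reads them. A minimal sketch of the standard fix, using the control names from the question, is to guard the initialization with Page.IsPostBack:

        protected void Page_Load(object sender, EventArgs e)
        {
            // Populate the textboxes only on the initial GET request.
            // On postback, skip this so the user's edits survive long
            // enough for btnSubmit_Click to read them.
            if (!IsPostBack)
            {
                textboxTitle.Text = "This is the title of the news";
                textboxBody.Text = "This is the body of the news";
            }
        }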

  • jQuery adding and removing class dynamically

    - by user244394
    I am trying to add the class "selected" when a link is clicked; when the user clicks the next link, I want to remove the previously applied "selected" class and add "selected" to the newly clicked link. Thanks in advance.

        $(document).ready(function() {
            $('.news a').click(function(){
                $(this).addClass("selected");
            });
        });

        <div class="news-w">
            <div class="news" id="getnews-1"><a href="#">topic</a></div>
            <div class="news" id="getnews-2"><a href="#">topic</a></div>
            <div class="news" id="getnews-3"><a href="#">topic</a></div>
            <div class="news" id="getnews-4"><a href="#">topic</a></div>
            <div class="news" id="getnews-5"><a href="#">topic</a></div>
        </div>
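
    One straightforward approach (a sketch) is to clear the class from every link in the group before re-adding it to the one that was clicked:

        $(document).ready(function() {
            $('.news a').click(function(e) {
                e.preventDefault();                    // keep the '#' links from jumping
                $('.news a').removeClass("selected");  // drop the previous selection
                $(this).addClass("selected");          // mark the clicked link
            });
        });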

  • What are proven, scalable data persistence solutions for consumer profiles?

    - by Hubbard
    Consumer profiles with analytical scores [ConsumerID, 1..n demographic variables, 1..n analytical scores, e.g. "likely to churn", "likely to buy an item worth $100", etc.] have to be fast to query if they are to be used for customizing web sites, consumer communications, etc. Well, if you have a large number of consumers and large profiles with a huge set of variables (as profiles describing human behaviour are likely to be), you are in trouble. If you really have a physical relational database to which you target a query, and then a physical disk starts to rotate someplace to give you an individual profile or a set of profiles, the profile user (a web site customizing a page, a recommendation engine making a recommendation, ...) has died of boredom before getting any observable results. There is the possibility of keeping the profiles in memory, which would of course increase performance hugely. What are the most proven solutions for fast-response, scalable consumer profile storage? Is there a shootout of these someplace?

  • Conceptual data modeling: Is RDF the right tool? Other solutions?

    - by paprika
    I'm planning a system that combines various data sources and lets users run simple queries on them. Part of the system needs to act as an abstraction layer that knows all connected data sources: the user shouldn't [need to] know about the underlying data "providers". A data provider could be anything: a relational DBMS, a bug tracking system, ..., a weather station. They are hooked up to the query system through a common API that defines how to "offer" data. The type of queries a certain data provider understands is given by its "offer" (e.g. I know these entities, I can give you aggregates of type X for relationship Y, ...). My concern right now is the unification of the data: the various data providers need to agree on a common vocabulary (e.g. the name of the entity "customer" could vary across systems). Thus, a high-level representation of the entities and their relationships needs to be defined. So far I have the following requirements: I need to be able to define objects and their properties/attributes. Further, arbitrary relations between these objects need to be represented: a verb that defines the nature of the relation (e.g. "knows"), the multiplicity (e.g. 1:n), and the direction/navigability of the relation. It occurs to me that RDF is a viable option, but is it "the right tool" for this job? What other solutions/frameworks exist for semantic data modeling that have a machine-readable representation, and why are they better suited for this task? I'm grateful for every opinion and pointer to helpful resources.
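
    For a sense of how those requirements map onto RDF, here is a minimal sketch in Turtle syntax (the ex: vocabulary is illustrative, not a recommendation): classes model the entities, a property with rdfs:domain and rdfs:range gives a relation its verb and direction, and cardinalities beyond "functional" (at most one value) would need OWL restrictions:

        @prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
        @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
        @prefix owl:  <http://www.w3.org/2002/07/owl#> .
        @prefix ex:   <http://example.org/model#> .

        ex:Customer a rdfs:Class .
        ex:Order    a rdfs:Class .

        # "customer knows customer": the verb is the property name, the
        # direction is given by domain (subject) and range (object).
        ex:knows a rdf:Property ;
            rdfs:domain ex:Customer ;
            rdfs:range  ex:Customer .

        # n:1 relation: each order is placed by at most one customer.
        ex:placedBy a owl:FunctionalProperty ;
            rdfs:domain ex:Order ;
            rdfs:range  ex:Customer .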

  • What db fits me?

    - by afvasd
    Dear everyone, I am currently using MySQL, and I am finding that my schema is getting incredibly complicated. I am looking for a new database that will suit my needs. Let's assume I am building a news aggregator (which collects news from multiple websites). I then run algorithms to determine whether two news items from different sites are actually referring to the same topic, and use this to cluster news together. The relationship is depicted below:

        cluster
          \--news1
              \--word1
              \--word2
          \--news2
              \--word3
          \--news3
              \--word1
              \--word3

    I then apply some magic to determine the importance of each word. Summing the importance of each word gives me the importance of a news article; summing the importance of each news article gives me the importance of a cluster. Note that above clusters there are also subgroups (like splits by region, etc.) and categories (like sports, etc.) whose importance I have to determine per day. I have used views in the past to do so, but I realized that views are very slow, so I will normally insert into an actual table and index it for better performance. As you can see, this leads to multiple derived tables like (cluster, importance), (news, importance), (words, importance), which can get pretty messy. Also, the "importance" metric will change. It has become increasingly difficult to alter tables and update data (I am using TRUNCATE TABLE and then re-inserting from scratch). I am currently looking into something schemaless like MongoDB. I do not need distributedness. I would very much like something that is reasonably fast (and can be indexed) and a lot more flexible than a traditional RDBMS. Also, I need something that has some kind of ORM, because I personally like ORMs a lot; I am currently using SQLAlchemy. Please help!
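
    For the structure described, a document store lets a cluster own its news items and word scores directly, so the derived (x, importance) tables become fields that the scoring job rewrites in place. A sketch of one possible MongoDB document, as it would appear in the mongo shell (all field names are illustrative, not a final schema):

        {
            "_id": ObjectId("..."),
            "category": "sports",
            "region": "eu",
            "importance": 17.4,              // recomputed by the scoring job
            "news": [
                {
                    "title": "Some headline",
                    "site": "example.com",
                    "importance": 9.1,
                    "words": [
                        { "w": "word1", "importance": 2.0 },
                        { "w": "word3", "importance": 1.3 }
                    ]
                }
            ]
        }

    On the ORM side, object-document mappers such as MongoEngine play a role for MongoDB similar to what SQLAlchemy does for relational databases.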

  • Nginx rewrite for link shortener + Wordpress pretty URLs

    - by detusueno
    Okay, so I installed Nginx/PHP/MySQL/WordPress via an online walkthrough, and it had me enter these rewrites to enable WordPress pretty URLs:

        if (-f $request_filename) {
            break;
        }
        if (-d $request_filename) {
            break;
        }
        rewrite ^(.+)$ /index.php?q=$1 last;
        error_page 404 = //index.php?q=$uri;

    This is included in the vhost for my domain. What I'm trying to do now is add some redirection/link-shortener rewrites that will play nicely with the setup I have in mind. I'd like to redirect "x.com/y" to "x.com/script.php?id=y" for all external links that I post. The WordPress link setup right now has almost all internal links beginning with "news" (x.com/news/post-blah, x.com/news/category/1, etc.), BUT I also have a few root links that point to internal content (x.com/news, x.com/start). I'm guessing that's going to cause some conflicts. What's the best approach? I've never worked with Nginx (or any rewrite rules), but maybe I can distinguish between "x.com/news" and "x.com/news/" so they play nicely together? I had a friend set up a working version of this in Apache, and it'd be nice if I could get it up on Nginx again.
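
    One possible shape for this, sketched below and untested, assuming the script.php handler from the question and that PHP requests are already handled elsewhere in the vhost. The if/rewrite recipe above is the older style; try_files is the usual replacement. A ^~ prefix location makes the WordPress paths win over the regex location that catches single-segment short links:

        # WordPress content keeps its pretty URLs.
        location ^~ /news {
            try_files $uri $uri/ /index.php?q=$uri;
        }
        location = /start {
            try_files $uri /index.php?q=$uri;
        }

        # Anything else that is exactly one path segment is a short link.
        location ~ "^/[A-Za-z0-9_-]+$" {
            rewrite ^/(.+)$ /script.php?id=$1 last;
        }

        location / {
            try_files $uri $uri/ /index.php?q=$uri;
        }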

  • Applying ACLs to a Dovecot public namespace

    - by larsks
    I have a public namespace defined in my Dovecot (dovecot-2.0.9) configuration that looks like this:

        namespace {
            type = public
            separator = .
            prefix = news.
            location = maildir:/var/spool/news
            subscriptions = no
        }

    I would like to make all the mailboxes in this namespace read-only. I've got the following configuration for the ACL plugin:

        plugin {
            acl = vfile:/etc/dovecot/acls:cache_secs=300
        }

    After perusing the documentation, it seemed that for a mail folder /var/spool/news/.foo.bar I could place the following into /var/spool/news/.foo.bar/dovecot-acl:

        anyone rl

    But that doesn't have any effect. I also tried creating a file /usr/local/etc/dovecot/acls/news.foo.bar with the same contents, but that didn't do anything either. I've turned on mail debugging:

        mail_debug = yes

    But the log doesn't produce anything that appears to be relevant to ACL processing. I'm curious to know whether anyone has gotten this to work correctly, and if so, whether you could provide some configuration examples. Also, if there's any way to do this that doesn't involve per-mailbox configuration (e.g. the ability to apply an ACL to news.* or something), that would be awesome. Getting the documented behavior for default ACLs working would be a step in the right direction.

  • Unique Business Value vs. Unique IT

    - by barry.perkins
    When the age of computing started, technology was new, exciting, full of potential, and had a long way to grow. Vendor architectures were proprietary and limited in function at first, growing in capability and complexity over time. There were few if any "standards", let alone "open standards", and the concepts of "open systems" and "open architectures" were far in the future. Companies employed intelligent, talented, and creative people to implement the best possible solutions for their company. At first, those solutions were "unique" to each company. As time progressed, standards emerged, companies shared knowledge, the business capability supplied by technology grew, and companies continued to expand their use of technology. Taking advantage of change required companies to work through periodic "revolutionary" change cycles: costly changes that were fraught with risk, resulted in solutions with an increasingly shorter half-life, and frequently required altering existing business processes and retraining employees and partner businesses. The pace of technological invention and implementation grew at an ever-increasing rate, making the "revolutionary" approach based upon "proprietary" or "closed" architectures or technologies no longer viable. Concurrent with the advancement of technology, the rate of change in business increased, leading us to the incredibly fast-paced, highly charged, and competitive global economy that we have today, where the most successful companies are those that are good at implementing, leveraging, and exploiting change.

    Fast forward to today: a world where dramatic changes in business and technology happen continually, a world where "evolutionary" change is crucial. Companies can no longer afford to build "unique IT", nor can they afford regular intervals of "revolutionary" change, with the associated costs and risks. Human ingenuity was once again up to the task, turning technology into a platform supporting business through evolutionary change by employing "open": open standards, open systems, open architectures, and open solutions. Employing "open" enables companies to implement systems based upon technology, capability, and standards that will evolve over time, providing a solid platform upon which a company can drive business needs, requirements, functions, and processes down into the technology, rather than exposing technology to the business, allowing companies to focus on providing "unique business value" rather than "unique IT".

    The big question! Does moving from "older" technology that no longer meets the needs of today's business to new "open" technology require yet another "revolutionary" change? A "revolutionary" change with a short half-life, camouflaging reality with great marketing? The answer is "perhaps". With the endless options available to choose from, it is entirely possible to implement a solution that may work well today but in 5 years' time will become yet another albatross for the company to bear. Some solutions may look good today, solving a budget challenge by reducing cost or solving a specific tactical challenge, but result in highly complex environments that may be difficult to manage and maintain and that limit the future potential of your business. Put differently, some solutions might push today's challenge into the future, resulting in a more complex and expensive solution. There is no such thing as a "one size fits all" IT solution for business.

    If all companies implemented business solutions based upon technology that required, or forced, the same business processes across all businesses in an industry, it would be extremely difficult to show competitive advantage through "unique business value". It would be equally difficult to "evolve" to meet or exceed business needs and keep up with today's rapid pace of change. How does one ensure that they do not jump from one trap directly into another? Or, to put it positively, there are solutions available today that can address these challenges and issues. How does one ensure that the buying decision of today will serve the business well for years into the future?

    Intelligent and informed decisions: "buying right". In a previous blog entry, we discussed the value of linking tactical to strategic. The key is driving the focus to what is best for your business: handling today's tactical issues while also aligning with a roadmap/strategy that is tightly aligned with your strategic business objectives. When considering the plethora of possible options that provide various approaches to solving today's complex business problems, it is extremely important to ensure that vendors supplying those options focus on what is best for your business, supply sufficient information, provide adequate answers to questions, address challenges, issues, concerns, and objections honestly and openly, and focus on supplying solutions that are tailored for, and deliver the most business value possible for, your business. Here are a few questions to consider relative to the proposed options that should help ensure that today's solution doesn't become tomorrow's problem. Do the proposed solutions:

        - Solve the problem(s) you are trying to address?
        - Provide a solid foundation upon which to grow/enhance your business?
        - Provide tactical gains that align with and enable your strategic business goals/objectives?
        - Provide an infrastructure that can be leveraged with subsequent projects?
        - Solve problems for the business overall and the lines of business, or just for IT?
        - Simplify your current environment?
        - Provide the basis for business efficiency, agility, and clarity; for governance, risk, and compliance; and for real-time business visibility and trend analysis?

    And does your IT staff have the knowledge/experience to successfully manage the proposed systems once they are deployed in production? Done well, you will be presented with options tailored to your business that enable you to drive the "unique business value" necessary to help your business stand out from others, creating a distinct competitive advantage, delivering what your customers need when they need it, so you can attract new customers and new business and grow top-line revenue, all at a cost that provides a strong return on investment/return on assets. The net result is growth with managed cost, providing significantly improved profit margin and shareholder value.

  • CodeCritics.com: A no nonsense place for coders to critique code and raise awareness of standards and "good coding standards" [closed]

    - by Visionary Software Solutions
    StackOverflow has been a boon for increasing programming knowledge by allowing developers to ask for help and knowledge related to programming. Oftentimes these questions boil down to:

        - This code is broken, fix it
        - I don't know how to do this
        - Is this the best approach? (a hard question to answer on StackExchange, but democratic)

    Oftentimes, however, these questions are discussed at a very high level: "I use web services with a proxy client to ...". But, as Grady Booch is fond of saying, "the Truth is raw, naked, running code". Those high-level descriptions can be accomplished in many ways. Programming is an art, and there are an infinite number of different ways to do things, but some are better than others. A site devoted to Q&A can help increase knowledge; a site devoted to critique of code can help elevate standards and result in higher-quality knowledge. By upvoting the most elegant ways to solve a short, concise problem statement, or just looking at a piece of code and saying "this is ugly, how can we fix it?", we can increase community participation in discussions about the substantive details of an approach: "Is my commenting clear?", "Is this 3 nested for-loops with a continue that breaks in a special case a good way of building an object?", "Does this extremely generic and polymorphic inheritance hierarchy have issues?". Code is both an art/craft and a science/engineering artifact. Doesn't it deserve the same type of review treatment as a painting and an experiment? For praising those who provide that moment of zen when looking at exceptionally good code that makes you believe in a better tomorrow, and panning those whose offal is so offensive that were you to meet them on the job you'd say "YOU! GET OUT!!!" Hence, CodeCritics: a collaborative critiquing platform in the style of StackOverflow, focused solely on critiquing code, that can act as collaborative code review and assist in the discovery of design patterns.

  • Welcome to the ISV Migration Center (IMC) Team blog

    - by lukasz.romaszewski(at)oracle.com
    Welcome to the ISV Migration Center (IMC) Team blog. The IMC is a team of senior Oracle technical consultants whose aim is to enable partners to rapidly and successfully adopt and implement Oracle's latest technology. The IMC consultants are trained and equipped to deliver leading-edge, enterprise-quality technology solutions. This blog has been created to serve as an information-exchange platform on Oracle Fusion Middleware and Database products, so you will find how-tos, articles, demos, and other technical resources. We will also publish our upcoming workshops, webcasts, and seminars, so make sure you check it regularly to get the latest updates.

    Here's our team:

    Lukasz Romaszewski: Java and middleware specialist with 8 years of experience architecting, developing, and supporting enterprise solutions based on J2EE and Oracle Database technology. At Oracle since April 2008, working as an IMC Migration Consultant at the Oracle Partner Hub in Cracow, Poland; helping Oracle partners migrate their solutions to the latest Oracle Fusion Middleware stack and running hands-on migration workshops and seminars across Europe. Experienced in: Oracle WebLogic Application Server 11g; Application Development Framework (ADF); Oracle SOA Suite 11g; Oracle Forms 6i, 10g, and 11g; Oracle Database (PL/SQL, AQ, XML DB); Java EE 5.0 based architecture.

    Murat Teksoz: Specialist in Oracle Database and database options, Oracle Linux, APEX, and Oracle Business Intelligence, with 13 years of experience in database management, performance tuning, diagnostics, installation and configuration, database security, and high-availability and disaster-recovery solutions. Working at the Oracle IMC Istanbul since September 2008, delivering partner workshops and seminars in Europe and Central Asia. Experienced in: Oracle 9i, 10g, and 11g database solutions; Oracle Partitioning, Total Recall, and Advanced Compression; high-availability solutions (Oracle Real Application Clusters); disaster-recovery solutions (Oracle Data Guard); Oracle Grid Control; Oracle Linux; Oracle Business Intelligence 10g-11g; migration tools (SQL Developer) for migrating from SQL Server, MySQL, Sybase, and DB2 to Oracle Database; Oracle APEX (Application Express).

    Vadim Melnikov: Oracle Database specialist with database options, Linux, and virtualization skills. Vadim has more than 8 years of experience with Oracle products and now works as a database consultant at the Oracle IMC Moscow as an employee of the FORS Development Center, a Russian Oracle Platinum partner; helping Oracle partners migrate solutions to Oracle from other platforms and adopt new Oracle technologies, and running workshops and seminars. Experienced in: Oracle Database 9i, 10g, and 11g solutions (SQL, PL/SQL, installation, configuration, performance tuning, diagnostics, database management); Oracle DB options (Partitioning, Total Recall, Advanced Compression); Oracle Enterprise Manager; Oracle Enterprise Linux; Oracle VM 2 for x86; migration to Oracle Database; Oracle Application Express.

    Gokhan Gungor: Java (J2EE) lead developer and architect. Has designed and developed web applications, middleware systems/services, desktop applications, and back-end tools/services using Java, WebLogic Server, JBoss, and open source frameworks. Joined Oracle in 2010 as a Fusion Middleware consultant at the Istanbul IMC, responsible for running migration and adoption workshops and seminars covering Java technology, ADF, WebLogic, and SOA, and for providing technical consultancy on migration projects. Experienced in: Oracle WebLogic Server; Application Development Framework (ADF); JDeveloper; Java EE (EJB, JMS, Servlet, JSP, JSF, JavaMail, JTA, JAAS, JSTL, JAXB); Java SE (JavaBeans, JDBC, XML, XSL, RMI, JNDI, JAXP); Oracle Database 10g and 11g.

    Dmitry Nefedkin: Oracle middleware and Java specialist with 7+ years of experience developing and designing enterprise solutions based on Oracle Database and Middleware, developing Oracle E-Business Suite customizations, and designing integration architecture within companies. Joined Oracle in October 2010 as an IMC FMW Consultant in Oracle Alliances & Channels in Moscow, Russia. Experienced in: Oracle WebLogic Application Server 11g; Oracle Service Bus 11g; Oracle SOA Suite 10g (BPEL PM, ESB, OWSM); Oracle Application Server 10g; Oracle Forms 6i and 9i; Oracle BI Publisher; Oracle ADF 10g; Oracle Database (SQL tuning, PL/SQL, AQ, Streams); Java EE 5 development.

    Check out our web site as well: http://www.oracle.com/partners/en/most-popular-resources/027930

  • MapRedux - PowerShell and Big Data

    - by Dittenhafer Solutions
    MapRedux – #PowerShell and #Big Data

    Have you been hearing about "big data", "map reduce", and other large-scale computing terms over the past couple of years and been curious to dig into more detail? Have you read some of the Apache Hadoop online documentation and unfortunately concluded that it wasn't feasible to set up a "test" Hadoop environment on your machine? More recently, I have read about some of Microsoft's work to enable Hadoop on the Azure cloud. Being a "Microsoft"-leaning technologist, I am more inclined to be successful with experimentation on the Windows platform. Of course, it is not that I am "religious" about one set of technologies over another, but rather more experienced. Anyway, within the past couple of weeks I have been thinking about PowerShell a bit more as the 2012 PowerShell Scripting Games approach, and it occurred to me that PowerShell's support for Windows Remote Management (WinRM) and some other inherent features of PowerShell might lend themselves particularly well to a simple implementation of the MapReduce framework. I fired up my PowerShell ISE and started writing just to see where it would take me. Quite simply, the ScriptBlock feature, combined with the ability of Invoke-Command to create remote jobs on networked servers, provides much of the plumbing of a distributed computing environment. There are some limiting factors, of course. Microsoft provided some default settings which prevent PowerShell from taking over a network without administrative approval first. But even with just one adjustment, a given Windows-based machine can become a node in a MapReduce-style distributed computing environment.

    Ok, so enough introduction. Let's talk about the code. First, any machine that will participate as a remote "node" will need WinRM enabled for remote access, as shown below. This is not exactly practical for hundreds of intended nodes, but for one (or five) machines in a test environment it does just fine.

        C:\> winrm quickconfig
        WinRM is not set up to receive requests on this machine.
        The following changes must be made:

        Set the WinRM service type to auto start.
        Start the WinRM service.

        Make these changes [y/n]? y

    Alternatively, you could take the approach described in the "Remotely enable PSRemoting" post from the TechNet forum and use PowerShell to create remote scheduled tasks that will call Enable-PSRemoting on each intended node.

    Invoke-MapRedux

    Moving on, now that you have one or more remote "nodes" enabled, you can consider the actual Map and Reduce algorithms. Consider the following snippet:

        $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose

    Invoke-MapRedux takes an instance of a MapReduceItem, which references the Map and Reduce scriptblocks, an array of computer names which are the remote nodes, and the initial data set to be processed. As simple as that, you can start working with the concepts of big data and the MapReduce paradigm. Now, how did we get there? I have published the initial version of my PsMapRedux PowerShell module on GitHub. The PsMapRedux module provides the Invoke-MapRedux function described above. Feel free to browse the underlying code and even contribute to the project! In a later post, I plan to show some of the inner workings of the module, but for now let's move on to how the Map and Reduce functions are defined.

    Map

    Both the Map and Reduce functions need to follow a prescribed prototype. The prototype for a Map function in the MapRedux module is a simple scriptblock that takes one PsObject parameter and returns a hashtable. It is important to note that the PsObject $dataset parameter is a MapRedux custom object that has a "Data" property which offers an array of data to be processed by the Map function.

        $aMap =
        {
            Param
            (
                [PsObject] $dataset
            )

            # Indicate the job is running on the remote node.
            Write-Host ($env:computername + "::Map");

            # The hashtable to return
            $list = @{};

            # ... Perform the mapping work and prepare the $list hashtable
            # result with your custom PSObject...
            # ... The $dataset has a single 'Data' property which contains
            # an array of data rows which is a subset of the originally
            # submitted data set.

            # Return the hashtable (Key, PSObject)
            Write-Output $list;
        }

    Reduce

    Likewise, the Reduce function follows a simple prototype that takes a $key and a result $dataset from the MapRedux partitioning function (which joins the Map results by key). Again, the $dataset is a MapRedux custom object that has a "Data" property as described in the Map section.

        $aReduce =
        {
            Param
            (
                [object] $key,
                [PSObject] $dataset
            )

            Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count)

            # The hashtable to return
            $redux = @{};

            # Return
            Write-Output $redux;
        }

    All Together Now

    When everything is put together in a short example script, you implement your Map and Reduce functions, query for some starting data, build the MapReduxItem via New-MapReduxItem, and call Invoke-MapRedux to get the process started:

        # Import the MapRedux and SQL Server providers
        Import-Module "MapRedux"
        Import-Module "sqlps" -DisableNameChecking

        # Query the database for a dataset
        Set-Location SQLSERVER:\sql\dbserver1\default\databases\myDb
        $query = "SELECT MyKey, Date, Value1 FROM BigData ORDER BY MyKey";
        Write-Host "Query: $query"
        $dataset = Invoke-SqlCmd -query $query

        # Build the Map function: sum and count Value1 per key.
        $MyMap =
        {
            Param
            (
                [PsObject] $dataset
            )

            Write-Host ($env:computername + "::Map");
            $list = @{};
            foreach($row in $dataset.Data)
            {
                # Write-Host ("Key: " + $row.MyKey.ToString());
                if($list.ContainsKey($row.MyKey) -eq $true)
                {
                    $s = $list.Item($row.MyKey);
                    $s.Sum += $row.Value1;
                    $s.Count++;
                }
                else
                {
                    $s = New-Object PSObject;
                    $s | Add-Member -Type NoteProperty -Name MyKey -Value $row.MyKey;
                    $s | Add-Member -Type NoteProperty -Name Sum -Value $row.Value1;
                    $s | Add-Member -Type NoteProperty -Name Count -Value 1;
                    $list.Add($row.MyKey, $s);
                }
            }

            Write-Output $list;
        }

        # Build the Reduce function: average Value1 for each key.
        $MyReduce =
        {
            Param
            (
                [object] $key,
                [PSObject] $dataset
            )

            Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count)

            $redux = @{};
            $sum = 0;
            $count = 0;
            foreach($s in $dataset.Data)
            {
                $sum += $s.Sum;
                $count += $s.Count;
            }

            # Reduce to the average for this key.
            $redux.Add($key, $sum / $count);

            # Return
            Write-Output $redux;
        }

        # Create the item data
        $Mr = New-MapReduxItem "My Test MapReduce Job" $MyMap $MyReduce

        # Array of processing nodes...
        $MyNodes = ("node1", "node2", "node3", "node4", "localhost")

        # Run the Map Reduce routine...
        $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose

        # Show the results
        Set-Location C:\
        $MyMrResults | Out-GridView

    Conclusion

    I hope you have seen through this article that PowerShell has a significant infrastructure available for distributed computing. While it does take some code to expose a MapReduce-style framework, much of the work is already done, and PowerShell could prove to be the easiest platform to develop and run big data jobs in your corporate data center, potentially in the Azure cloud, or certainly as an academic exercise at home or school.

    Follow me on Twitter to stay up to date on the continuing progress of my PowerShell MapRedux module, and thanks for reading!

    Daniel

  • Lubuntu wireless issue with Broadcom chipset

    - by Variant Web Solutions
    I'm a web dev that just started a new venture: buying wiped laptops in bulk and selling them at low cost to lower-income families that want a laptop that will simply perform, not become virus-ridden, and not require constant maintenance. So naturally I opted for a Linux distro, and after some research Lubuntu was my top pick. I'm not a stranger to Linux, as all of the servers for my web dev business run Linux, but I am new to L/Ubuntu, and I'm having issues with wifi (Broadcom chipsets) on the 20 or so Dells that I have right now: D800s, D810s, and E5400s. Not sure if you can point me in a solid direction; I've scoured (and implemented) the suggestions on Ask Ubuntu and am still coming up short. On one of the E5400s (though they all seem to suffer the same errors) I got the following:

        dell-latitude-e5400@dell-Latitude-E5400:~$ iwconfig
        lo        no wireless extensions.
        eth0      no wireless extensions.

        dell-latitude-e5400@dell-Latitude-E5400:~$ lspci
        00:00.0 Host bridge: Intel Corporation Mobile 4 Series Chipset Memory Controller Hub (rev 07)
        00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07)
        00:02.1 Display controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07)
        00:1a.0 USB controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #4 (rev 02)
        00:1a.1 USB controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #5 (rev 02)
        00:1a.2 USB controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #6 (rev 02)
        00:1a.7 USB controller: Intel Corporation 82801I (ICH9 Family) USB2 EHCI Controller #2 (rev 02)
        00:1b.0 Audio device: Intel Corporation 82801I (ICH9 Family) HD Audio Controller (rev 02)
        00:1c.0 PCI bridge: Intel Corporation 82801I (ICH9 Family) PCI Express Port 1 (rev 02)
        00:1c.1 PCI bridge: Intel Corporation 82801I (ICH9 Family) PCI Express Port 2 (rev 02)
        00:1c.4 PCI bridge: Intel Corporation 82801I (ICH9 Family) PCI Express Port 5 (rev 02)
        00:1d.0 USB controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #1 (rev 02)
        00:1d.1 USB controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #2 (rev 02)
        00:1d.2 USB controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #3 (rev 02)
        00:1d.7 USB controller: Intel Corporation 82801I (ICH9 Family) USB2 EHCI Controller #1 (rev 02)
        00:1e.0 PCI bridge: Intel Corporation 82801 Mobile PCI Bridge (rev 92)
        00:1f.0 ISA bridge: Intel Corporation ICH9M LPC Interface Controller (rev 02)
        00:1f.2 IDE interface: Intel Corporation 82801IBM/IEM (ICH9M/ICH9M-E) 2 port SATA Controller [IDE mode] (rev 02)
        00:1f.3 SMBus: Intel Corporation 82801I (ICH9 Family) SMBus Controller (rev 02)
        00:1f.5 IDE interface: Intel Corporation 82801IBM/IEM (ICH9M/ICH9M-E) 2 port SATA Controller [IDE mode] (rev 02)
        02:01.0 CardBus bridge: Ricoh Co Ltd RL5c476 II (rev ba)
        02:01.1 FireWire (IEEE 1394): Ricoh Co Ltd R5C832 IEEE 1394 Controller (rev 04)
        02:01.2 SD Host controller: Ricoh Co Ltd R5C822 SD/SDIO/MMC/MS/MSPro Host Adapter (rev 21)
        09:00.0 Ethernet controller: Broadcom Corporation NetXtreme BCM5761e Gigabit Ethernet PCIe (rev 10)

        dell-latitude-e5400@dell-Latitude-E5400:~$ rfkill
        Usage:  rfkill [options] command
        Options:
                --version       show version (0.5-1ubuntu1 (Ubuntu))
        Commands:
                help
                event
                list [IDENTIFIER]
                block IDENTIFIER
                unblock IDENTIFIER
        where IDENTIFIER is the index no. of an rfkill switch or one of:
                all wifi wlan bluetooth uwb ultrawideband wimax wwan gps fm nfc
        dell-latitude-e5400@dell-Latitude-E5400:~$
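
    Worth noting before anything else: the pasted lspci output lists only the BCM5761e Ethernet controller and no wireless device at all, and iwconfig reports no wireless extensions, so the first step is to confirm the wifi card is even detected. A rough sketch of the usual checks and driver installs on Ubuntu-family systems (the right package depends on the exact Broadcom chipset):

        # Is any wireless controller visible on the bus?
        lspci -nn | grep -i -E 'network|wireless'

        # Broadcom proprietary STA/wl driver (covers many BCM43xx chipsets)
        sudo apt-get update
        sudo apt-get install bcmwl-kernel-source

        # Alternatively, firmware for the open b43 driver
        sudo apt-get install firmware-b43-installer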

  • Part-time Programming Job London

    - by Bluechip Solutions
    I am a student at Middlesex University, London, studying Information Technology. I really love software development, and I have taught myself HTML + CSS, JavaScript (I use jQuery and AngularJS), and Java (I learnt this in school). I have developed a few apps (a desktop app in Java and a mobile app with AngularJS and PhoneGap). I am looking at applying for a part-time programming job to develop myself. Are there part-time jobs available for someone like me, and is my skill set enough to get me a job? I understand this topic may not be ideal here, but this is the only place I know that can provide me answers. Thank you!

  • preg_match to match an optional string, but not match all of the string

    - by buggedcom
    Take for example the following regex match:

        preg_match('!^publisher/([A-Za-z0-9\-\_]+)/([0-9]+)/([0-9]{4})-(january|february|march|april|may|june|july|august|september|october|november|december):([0-9]{1,2})-([0-9]{1,2})/([A-Za-z0-9\-\_]+)/([0-9]+)(/page-[0-9]+)?$!',
            'publisher/news/1/2010-march:03-23/test_title/1/page-1',
            $matches);
        print_r($matches);

    It produces the following:

        Array
        (
            [0] => publisher/news/1/2010-march:03-23/test_title/1/page-1
            [1] => news
            [2] => 1
            [3] => 2010
            [4] => march
            [5] => 03
            [6] => 23
            [7] => test_title
            [8] => 1
            [9] => /page-1
        )

    However, as the last match is optional, it also works when matching "publisher/news/1/2010-march:03-23/test_title/1". My problem is that I want to match (/page-[0-9]+) if it exists, but capture only the page number, so "publisher/news/1/2010-march:03-23/test_title/1/page-1" would match like so:

        Array
        (
            [0] => publisher/news/1/2010-march:03-23/test_title/1/page-1
            [1] => news
            [2] => 1
            [3] => 2010
            [4] => march
            [5] => 03
            [6] => 23
            [7] => test_title
            [8] => 1
            [9] => 1
        )

    I've tried the following regex:

        '!^publisher/([A-Za-z0-9\-\_]+)/([0-9]+)/([0-9]{4})-(january|february|march|april|may|june|july|august|september|october|november|december):([0-9]{1,2})-([0-9]{1,2})/([A-Za-z0-9\-\_]+)/([0-9]+)/?p?a?g?e?-?([0-9]+)?$!'

    This works; however, it will also match "publisher/news/1/2010-march:03-23/test_title/1/1". I have no idea how to match a part of the string without having it come back in the matches. Is it possible in a single regex?
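
    One way to do this is a non-capturing group, (?:...), so the literal /page- must still be present for the group to match, but only the inner digit group is captured. A sketch against the URL from the question:

        preg_match('!^publisher/([A-Za-z0-9\-\_]+)/([0-9]+)/([0-9]{4})-(january|february|march|april|may|june|july|august|september|october|november|december):([0-9]{1,2})-([0-9]{1,2})/([A-Za-z0-9\-\_]+)/([0-9]+)(?:/page-([0-9]+))?$!',
            'publisher/news/1/2010-march:03-23/test_title/1/page-1',
            $matches);
        // $matches[9] is now "1" when /page-N is present, and unset otherwise.
        // A bare trailing "/1" no longer matches, because the literal "/page-"
        // is still required inside the optional group.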

  • Need help displaying data inside a marquee

    - by user59637
    Hi all, I want to display news inside the marquee markup in my banking application, but it's not working. Please can somebody help me find the error in my code? Here is my code:

        <marquee bgcolor="silver" direction="left" id="marq1" runat="server" behavior="scroll"
                 scrolldelay="80" style="height: 19px" width="565">
            <% String se = Session["countnews"].ToString();
               for (int i = 0; i < int.Parse("" + se); i++)
               { %>
                <strong><% Response.Write("&nbsp;&nbsp;" + Session["news" + i] + "&nbsp;&nbsp;"); %></strong>
            <% } %>
        </marquee>

        public class News
        {
            DataSet ds = new DataSet("Bank");
            SqlConnection conn;
            String check;
            SqlDataAdapter sda;
            int i;
            public string News_Name;
            public int Count_News;

            public int newsticker()
            {
                conn = new SqlConnection(ConfigurationManager.ConnectionStrings["BankingTransaction"].ConnectionString.ToString());
                check = "Select NewsTitle from News where NewsStatus = 'A'";
                sda = new SqlDataAdapter(check, conn);
                sda.Fill(ds, "News");
                if (ds.Tables[0].Rows.Count > 0)
                {
                    for (i = 0; i < ds.Tables[0].Rows.Count; i++)
                    {
                        News_Name = i + ds.Tables[0].Rows[i].ItemArray[0].ToString();
                    }
                    Count_News = ds.Tables[0].Rows.Count;
                }
                else
                {
                    News_Name = 0 + "Welcome to WestSide Bank Online Web site!";
                    Count_News = 1;
                }
                return int.Parse(Count_News.ToString());
            }
        }

        protected void Page_Load(object sender, EventArgs e)
        {
            News obj = new News();
            try
            {
                obj.newsticker();
                Session["news"] = obj.News_Name.ToString();
                Session["countnews"] = obj.Count_News.ToString();
            }
            catch (SqlException ex)
            {
                Response.Write("Error in login" + ex.Message);
                Response.Redirect("Default.aspx");
            }
            finally
            {
                obj = null;
            }
        }
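
    One likely source of the problem: the markup reads Session["news" + i] (one indexed key per news item), but Page_Load only ever sets Session["news"], and the loop in newsticker() overwrites News_Name on every iteration, so at most one concatenated title survives. A sketch of a fix that stores each title under its own indexed session key (names taken from the question's code; this is one possible repair, not the only one):

        public int newsticker()
        {
            conn = new SqlConnection(ConfigurationManager.ConnectionStrings["BankingTransaction"].ConnectionString);
            check = "Select NewsTitle from News where NewsStatus = 'A'";
            sda = new SqlDataAdapter(check, conn);
            sda.Fill(ds, "News");
            if (ds.Tables[0].Rows.Count > 0)
            {
                for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
                {
                    // One session entry per headline, matching the
                    // Session["news" + i] reads in the marquee markup.
                    HttpContext.Current.Session["news" + i] = ds.Tables[0].Rows[i]["NewsTitle"].ToString();
                }
                Count_News = ds.Tables[0].Rows.Count;
            }
            else
            {
                HttpContext.Current.Session["news0"] = "Welcome to WestSide Bank Online Web site!";
                Count_News = 1;
            }
            HttpContext.Current.Session["countnews"] = Count_News.ToString();
            return Count_News;
        }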

  • Where to place logic in a rich domain model

    - by Fino
    I have a model "news item" which contains text, an image, etc., to display as latest news on several pages of a website. This "news item" can also be posted to Twitter or Facebook. Is it clean to implement a post method inside the news item model and inject the different post implementations as strategies? Or is it better to have a separate application service for this? Thanks
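
    A minimal sketch of the first option the question describes, with the delivery mechanics injected as a strategy so the domain model never depends on Twitter or Facebook directly (all names here are illustrative):

        // The strategy: how a news item gets published somewhere.
        interface Publisher {
            void publish(NewsItem item);
        }

        class TwitterPublisher implements Publisher {
            public void publish(NewsItem item) { /* call the Twitter API here */ }
        }

        class FacebookPublisher implements Publisher {
            public void publish(NewsItem item) { /* call the Facebook API here */ }
        }

        class NewsItem {
            private final String title;
            private final String text;

            NewsItem(String title, String text) {
                this.title = title;
                this.text = text;
            }

            // The model exposes the behavior but stays free of
            // social-network details; those are injected per call.
            void post(Publisher publisher) {
                publisher.publish(this);
            }
        }

    Whether this belongs on the model or in an application service is the real trade-off: the method reads naturally on the model, but an application service keeps infrastructure concerns out of the domain layer entirely.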

  • Fetching only new rows from MySQL with jQuery AJAX

    - by testkhan
    I have a table named news with 3 fields (id, news, time), and I have a setInterval that runs every 3 minutes to fetch news from Google or other news sites. Now I want to fetch, every 5 minutes, only the newly inserted rows with jQuery's $.ajax(). How can I do that? Do I reload the whole table, or is there a way to fetch only the new rows?
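
    One common pattern is to remember the highest id seen so far and ask the server only for rows above it. A sketch, where the endpoint name, response shape, and markup are all assumptions:

        // Keep the highest id seen so far and poll for newer rows only.
        var lastId = 0;

        function fetchNews() {
            $.ajax({
                url: 'get_news.php',            // hypothetical endpoint
                data: { since_id: lastId },
                dataType: 'json',
                success: function (rows) {
                    $.each(rows, function (i, row) {
                        if (row.id > lastId) { lastId = row.id; }
                        $('#news').append('<li>' + row.news + '</li>');
                    });
                }
            });
        }

        setInterval(fetchNews, 5 * 60 * 1000);  // every 5 minutes

        // On the server, return only the new rows, e.g.:
        //   SELECT id, news, time FROM news WHERE id > ? ORDER BY id ASC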

  • Visit Oracle Linux Pavilion @ Oracle OpenWorld

    - by Zeynep Koch
    Back by popular demand, Oracle will again host the Oracle Linux Pavilion at Oracle OpenWorld from October 1-3. The pavilion will be located in the Exhibition Hall at Moscone South, Booth 1033, next to the Oracle DEMOgrounds and Oracle Linux demopods. At the pavilion a select group of ISVs, IHVs, and SIs will showcase their products that have been Oracle Linux- and/or Oracle VM-certified. These certified products enable customer applications to run faster, thereby saving money. Partners exhibiting their solutions in the Oracle Linux Pavilion include:

        - BeyondTrust: context-aware security intelligence for dynamic IT infrastructures such as cloud, mobile, and virtual technologies
        - Centrify: control, secure, and audit access to cross-platform systems, mobile devices, and applications
        - Data Intensity: cloud services and application management
        - Fujitsu: technology platforms, private cloud, services, ubiquitous and device solutions
        - HP: converged cloud, converged infrastructure, application transformation, and information optimization
        - LSI: intelligent solid-state storage solutions for breakthrough database acceleration
        - Mellanox: InfiniBand and Ethernet end-to-end server and storage interconnect solutions and services for data centers
        - Micro Focus: mainframe solutions, application modernization and development tools, software quality tools
        - NetApp: storage and data management
        - QLogic: high performance networking
        - Teleran: BI and data warehouse management solutions for Oracle Exadata Database Machine and Oracle Database

    Be sure to pick up your free Oracle Linux and Oracle VM DVD Kit if you visit one of these partners. And speaking of free, be sure to stop by for some cool treats, courtesy of sponsor QLogic:

        - Smoothie Bar on Monday, October 1, from 2:30 p.m. - 5:30 p.m.
        - Ice Cream Social on Wednesday, October 3, from 1:00 p.m. - 2:00 p.m.

    We look forward to seeing you at the pavilion.

  • How can I create (or do I even need to create) an alias of a DNS MX record?

    - by AKWF
    I am in the process of moving my DNS records from Network Solutions to the Amazon Route 53 service. While I know and understand a little about the basic kinds of records, I am stumped on how to create the record that will point to the MX record on Network Solutions (if I'm even saying that right). On Network Solutions I have this (mail servers are listed in rank order):

        Mail Servers (MX Records) for myapp.net
        MX Mail Server (Preference): inbound.myapp.net.netsolmail.net. (10)
        TTL: 7200
        Labeled: Network Solutions E-mail

    I have read that the payload for an MX record states that it must point to an existing A record in the DNS. Yet in the example above, that inbound.myapp... record only has the words "Network Solutions E-mail" next to it. Our email is hosted at Network Solutions. I have already created the CNAME records that look like this:

        mail.myapp.net  7200  mail.mycarparts.net.netsolmail.net.
        smtp.myapp.net  7100  smtp.mycarparts.net.netsolmail.net.

    Since I am only using Amazon as the DNS, do I even need to do anything with that MX record? I appreciate your help; I googled and researched this before I posted. This is my first post on Webmasters, although I've been on SO for a few years.

  • Oracle Linux Pavilion is Back for Oracle OpenWorld

    - by Oracle OpenWorld Blog Team
    By Zeynep Koch. Back by popular demand, Oracle will again host the Oracle Linux Pavilion at Oracle OpenWorld from October 1-3. The pavilion will be located in the Exhibition Hall at Moscone South, Booth 1033, next to the Oracle DEMOgrounds and Oracle Linux demopods. At the pavilion a select group of ISVs, IHVs, and SIs will showcase their products that have been Oracle Linux- and/or Oracle VM-certified. These certified products enable customer applications to run faster, thereby saving money. Partners exhibiting their solutions in the Oracle Linux Pavilion include:

        - BeyondTrust: context-aware security intelligence for dynamic IT infrastructures such as cloud, mobile, and virtual technologies
        - Centrify: control, secure, and audit access to cross-platform systems, mobile devices, and applications
        - Data Intensity: cloud services and application management
        - Fujitsu: technology platforms, private cloud, services, ubiquitous and device solutions
        - HP: converged cloud, converged infrastructure, application transformation, and information optimization
        - LSI: intelligent solid-state storage solutions for breakthrough database acceleration
        - Mellanox: InfiniBand and Ethernet end-to-end server and storage interconnect solutions and services for data centers
        - Micro Focus: mainframe solutions, application modernization and development tools, software quality tools
        - NetApp: storage and data management
        - QLogic: high performance networking
        - Teleran: BI and data warehouse management solutions for Oracle Exadata Database Machine and Oracle Database

    Be sure to pick up your free Oracle Linux and Oracle VM DVD Kit if you visit one of these partners. We look forward to seeing you at the pavilion.

  • Oracle Product Hub: Customer Perspectives at Oracle OpenWorld

    - by Mala Narasimharajan
    By Rohit Tandon. The Oracle Product Hub (OPH) Product Strategy team will be hosting a customer session dedicated to OPH at Oracle OpenWorld. The Oracle Product Information Management strategy team will have the pleasure of presenting this session with Motorola Mobility Solutions. In this session, you will hear how Motorola Solutions utilizes OPH to meet their IT and business needs. Arif Girniwala (MDM Lead, Motorola) and Chirag Jariwala (Manager, Deloitte Consulting) will cover the following topics, amongst others:

        1. How does Motorola Solutions decide on what is Product Master Data for their enterprise?
        2. What are the Data Governance structures, users, user roles, user security, etc. within Motorola Solutions?
        3. How does Motorola Solutions integrate, synchronize, and leverage OPH with Agile PLM?
        4. What is the Oracle Product Hub strategy and roadmap? (Speaker: Sachin Patel, Director, Oracle Product Hub Strategy)
        5. What are the implementation best practices for Oracle Product Hub? (Speaker: Srikant Bevara, Sr. Manager, Oracle Product Hub product management)

    If you're interested in hearing more about the above, then I recommend attending this session: Customer Perspectives: Master Product Data: Strategies for Effective Product Information Management with Motorola Mobility Solutions (CON8834), Tuesday, October 2nd, 10:15am - 11:15am, Moscone West - 3001. We hope to see you at OOW 2012 and stay in touch via our future blogs! For a list of all Oracle MDM sessions, click here.

  • Who are the SOA experts? Specialization recognized by customers

    - by Jürgen Kress
    You are looking for the SOA experts to deliver a successful project? Contact our Oracle SOA Specialized partners; you can recognize them by the logo, the plaques, and in the solutions catalog.

    Plaques: We would like to offer you a nice SOA Specialization plaque with your logo to prove your success. If you are a SOA Specialized partner and would like to request the plaque, please send Brigitte an e-mail with the following information:

        - Partner name
        - Partner logo (preferred: eps file)
        - Partner status (gold or platinum)

    We recommend mounting the plaque at your office reception; in addition, you can use the SOA Specialization logos on your website. Download logo: Gold & Platinum.

    Solutions Catalog: Please make sure that your Oracle Partner Network administrator adds your achieved Specializations to the Oracle Solutions Catalog. At our website www.oracle.com/soa we have started to promote the "find a Specialized Partner" feature for partners who added their Service Oriented Architecture Specialization to the solutions catalog. For administration, please visit "manage solutions catalog" within OPN. For a detailed tutorial and an FAQ, please visit http://tinyurl.com/Catalogorcl.

    For more information on SOA Specialization, please make sure that you read the SOA & Application Grid Specialization Guide and the SOA & Application Grid Specialization Checklist.

  • The Oracle Platform

    - by Naresh Persaud
    Today's enterprises typically create identity management infrastructures using ad-hoc, multiple point solutions. Relying on point solutions introduces complexity and a high cost of ownership, leading many organizations to rethink this approach. In a recent worldwide study of 160 companies conducted by Aberdeen Research, there was a discernible shift in this trend, as businesses are now looking to move away from the point-solution approach with multiple vendors and adopt an integrated platform approach. By deploying a comprehensive identity and access management strategy using a single platform, companies are saving as much as 48% in IT costs while reducing audit deficiencies by nearly 35%. According to Aberdeen's research, choosing an integrated suite or "platform" of solutions for identity management from a single vendor can have many advantages over choosing "point solutions" from multiple vendors. The Oracle Identity Management platform is uniquely designed to offer several compelling benefits to our customers:

        - Shared services: instead of separate solutions for administration, authentication, authorization, audit, and so on, Oracle Identity Management offers a set of shared services that can be consumed by each component in the stack and by developers of new applications.
        - Actionable intelligence: the most compelling benefit of the Oracle platform. If there is a compliance violation, the same platform can fix it; and if a user is logging in from an un-trusted device, or we detect an attack, the platform can act proactively on that information.
        - Suite interoperability: with the Oracle platform, the components all connect and integrate with each other. So if an organization purchases the platform for provisioning and then wants to manage access, the same platform can offer access management, which leads to cost savings.
        - Extensible and configurable: with point solutions you typically get limited ability to extend the tool to address custom requirements, but with the Oracle platform all of the components share a common way to extend the UI and behavior.

    Find out more about the Oracle platform approach in this presentation.

  • What production-ready SaaS (recurring billing) solutions are available for Rails?

    - by Benjamin Manns
    I am working on a software-as-a-service (SaaS) application and I am looking for a billing plugin of some sort that will manage my subscriptions, customers, and recurring billing. There is the RailsKits SaaS kit ($249.00), but I prefer to use open source software. I have also found maccman's saasy, but the phrase "At the moment this is alpha code - use at your own risk" makes me a tad bit nervous.

  • Extracting pure content / text from HTML Pages by excluding navigation and chrome content

    - by Ankur Gupta
    Hi, I am crawling news websites and want to extract the news title, news abstract (first paragraph), etc. I plugged into the WebKit parser code to easily navigate a web page as a tree. To eliminate navigation and other non-news content, I take the text version of the article (minus the HTML tags; WebKit provides an API for this). Then I run a diff algorithm comparing the text of various articles from the same website; this results in the shared text being eliminated, which gives me the content minus the common navigation content, etc. Despite the above approach I am still getting quite some junk in my final text, which results in an incorrect news abstract being extracted. The error rate is 5 in 10 articles, i.e. 50%. Can you suggest an alternative strategy for extracting pure content? Would/can learning Natural Language Processing help in extracting the correct abstract from these articles? How would you approach the above problem? Are there any research papers on the same? Regards, Ankur Gupta
