{"_id": "35236", "title": "Hotel Reservation Request Booking Paypal PHP", "text": "I'm making a website for a small hotel in php. The hotel owners want a reservation system that uses paypal. They want people to see a calendar and choose a date to make a reservation. If the day has vacancy, they want the user to request booking a room. This would then require the hotel owner to accept the purchase. I have not worked on a project that has this \"request to purchase\" method of buying with paypal. Is this possible? Does anyone know of an open php system that handles this?"} {"_id": "35540", "title": "How to backup a dev & QA folder website structure?", "text": "A site I just became in charge of uses a really simple two folder structure to host the dev site and the QA site. The sites are hosted on the company servers so I just have the sites' folders mapped on my desktop. I would like to run some kind of backup scheme, but I am finding it hard to think of a way to do this effectively. The problem is that we aren't using any revision control software, and since the servers aren't controlled by me, I don't think I will be able to implement anything like that. Or could I? The entire site is static too, so no DB's or anything besides html, images, PDFs, etc."} {"_id": "35230", "title": "As an affiliate, how do you know if a sale is made?", "text": "I want to start doing affiliate marketing on a blog. Now I have someone who wants to advertise who has contacted me. How do I know if he has made a sale to a user who came through my website? Is this only possible to track using a third party in order to know he isn't lying? If so, what platforms are available for this kind of \"indpendent\" affiliate marketing? i.e I don't need the matchmaking service, just the tracking service. 
The blog is WordPress, if it matters."} {"_id": "35548", "title": "Remove URL from \"Fetch as Google\" - in Google Webmaster Tools", "text": "I need to remove a URL from _Health -> Fetch as Google_ in Google Webmaster Tools. Is it possible?"} {"_id": "35238", "title": "How can I redirect all files in a directory that don't conform to a certain filename structure?", "text": "I have a website where a previous developer had updated several webpages. The issue is that the developer had made each new webpage with a new filename, and deleted the old filenames. I've worked with .htaccess redirects for a few months now, and have some understanding of the usage; however, I am stumped with this task. The old pages were named like so: www.domain.tld/subdir/file.html The new pages are named: www.domain.tld/subdir/file-new-name.html The first word of all new files is the exact name of the old file, and all new files have the same last 2 words. www.domain.tld/subdir/file1-new-name.html www.domain.tld/subdir/file2-new-name.html www.domain.tld/subdir/file3-new-name.html etc. We also need to be able to access the URL: www.domain.tld/subdir/ The new files have been indexed by Google (the old URLs cause 404s, and need to be redirected to the new ones so that Google will be friendly), and the client wants to keep the new filenames as they are more descriptive. 
I've attempted to redirect it in many different ways without success, but I'll show the one that stumps me the most: RewriteBase / RewriteCond %{THE_REQUEST} !^subdir/.*\\-new\\-name\\.html RewriteCond %{THE_REQUEST} !^subdir/$ RewriteRule ^subdir/(.*)\\.html$ http://www.domain.tld/subdir/$1\\-new\\-name\\.html [R=301,NC] When visiting `www.domain.tld/subdir/file1.html` in the browser, this causes a **403 Forbidden error** with a URL like so: www.domain.tld/subdir/file1-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name.html I'm certain it's probably something simple that I'm overlooking; can someone please help me get a proper redirect? **Thanks so much in advance!** **EDIT:** I've also got all the old filenames saved in a separate document in case I need them, set up like the following example: (file(1|2|3|4|5)|page(1|2|3|4|5)|a(l(l|lowed|ter)|ccept)"} {"_id": "48581", "title": "How to handle clients who want to add duplicate content to their sites", "text": "I have some clients who are requesting that I put pages on their websites that amount to duplicate content. One wants the exact same page on two of their websites. Another wants to put content from a brochure they have some right to reprint or distribute on their website. But the same brochure content is already online at the original author's website. I explain to my clients the possible consequences of publishing duplicate content to the web, such as that the page will not be indexed by search engines or their website may be penalized. But I really want to offer them alternative solutions. The obvious one is to link to the content on the original author's website, if that is possible. 
Does anyone have any other strategies for dealing with client requests to put duplicate content on their websites?"} {"_id": "53865", "title": "3rd party site using our Google Analytics tracking code, how can we block it?", "text": "A 3rd party has stolen our site and is running it for their own purposes. They have also not bothered to change the Google Analytics tracking code, so their site is now skewing our data. Is there a way we can block / disassociate their site from impacting our Google Analytics data?"} {"_id": "11540", "title": "Text files vs. text in database", "text": "I'm designing a new site, deciding whether to keep the bulk of the content for each page in a text file, with just the file name in a database record for that page, or to keep the entire text in the database record as a string. The text is typically a few hundred to a few thousand words, with embedded markup to include photos or whatever, to be processed in PHP before being sent out as HTML. I don't find enough enlightening discussion of this design choice online. What are the pros and cons of each way? The advantage of text files, I imagine, is easy access to the file by FTP or other means, making fixing typos or editing the material easy w/o having to fuss with the database at all. OTOH, keeping it directly in the DB means I won't have to bother learning how to read text files in PHP 8P Seriously, what implications are there for maintaining the site, security, efficiency, and aspects I'm not thinking of?"} {"_id": "11541", "title": "Why is \"mac\" disallowed in microsoft.com/robots.txt?", "text": "At http://www.microsoft.com/robots.txt, why are there entries like: Disallow: /*/mac/help.mspx Disallow: /*/mac/help.mspx? Disallow: /*/mactopia/help.mspx? .... 
Could there be a real reason not to allow spiders to index those pages, or is it something nefarious?"} {"_id": "503", "title": "What are the most important things I need to do to encourage Google Sitelinks?", "text": "My website used to have sitelinks and now it doesn't. It's very possible that it's due to changing the website to a sidebar design instead of having an \"interstitial\" type landing page which limited the number of choices, but I'm not sure. Here is how sitelinks might look for a site: ![Google Sitelinks](http://i.stack.imgur.com/sBaDc.jpg) What are some things that I can do to improve my chances of getting sitelinks?"} {"_id": "18547", "title": "How to best format your HTML to help Google display your website's navigation with the search result for your site in Google Search?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? Look at the first result for \"smashing magazine\" in the image below. See how Google grabbed SM's site navigation and displayed it below their search result? Are there any techniques that would help Google display the navigation like this? Are there any suggestions as to how to structure your HTML to help Google create this for your website's search result? Or in general, the best way to structure it? Tricks of the trade? etc.? ![Smashing Magazine's navigation added to their search result in Google Search](http://i.stack.imgur.com/CNaQC.png)"} {"_id": "26908", "title": "How to get grid view presence in Google search of my website as shown in Google search for \u201ctwitter\u201d?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? I have my website live and it is visible in Google search, but I want it to be displayed in a grid view as shown in a Google search for a keyword such as \"twitter\". 
Can someone guide me on how to make this happen?"} {"_id": "61307", "title": "Google sitelinks", "text": "How can I get Google to display sitelinks under search results when my website shows up? When my website shows up in position 1, which it does for a couple of keywords, it displays sitelinks; but whenever we show up in any position other than the first one, we don't have any sitelinks displayed at all. I know it's not just a matter of being in the top position or not, because I see sites all the time in any number of positions in the search results, and occasionally I'll see a site that has sitelinks even though it's halfway down page 3 or something. ![enter image description here](http://i.stack.imgur.com/UpR0H.jpg)"} {"_id": "68588", "title": "How to show different pages in the same SERP result", "text": "![enter image description here](http://i.stack.imgur.com/LZ2ty.png) How can I show different page headings in the same result for one website, as in the picture? I have seen many results like this. Are there any tutorials or resources?"} {"_id": "44308", "title": "SEO - be first and get my own company section in Google for my company name", "text": "I want my company name to be first in Google when I search for it, since it is a unique name. I also want it to have its own \"section\" with sub-links to the \"about\" page, \"product\" page and \"contact us\" page, like any other big company has. For example, if I type any other big company's name, I get their website on the top row, but it also has sub-links for \"contact us\", \"about us\" and more... ![enter image description here](http://i.stack.imgur.com/iJQFB.png) When I search my company name in Google, I get two full search pages with my company name, but all of them are from other sites that took my details from somewhere. **None of these links are my site pages!** I already read articles about \"keyword significance\" and \"crawl errors\" and most of the webmaster tools, but did I miss something else? 
How do I gather my site pages under my site link in Google? Is there a configuration for that?"} {"_id": "49966", "title": "How to customize website for Google first search result?", "text": "I don't know the terminology used for the following examples. I want to customize my website so that when a user searches for it, the first search result looks like the following examples. ![enter image description here](http://i.stack.imgur.com/xzjDD.png) * * * ![enter image description here](http://i.stack.imgur.com/Bm96g.png)"} {"_id": "60680", "title": "Google Search Results: sub-page description", "text": "I have a WordPress site that successfully has 6 sub-pages listed below the main result in Google. However, a page content snippet is showing up for each instead of the sub-page meta descriptions. I have verified in the page source that the meta descriptions are being successfully inserted by my SEO plugin. How can I define the sub-page descriptions?"} {"_id": "52031", "title": "How to make Google show my website pages the way I want", "text": "Let us take the example of eBay. ![enter image description here](http://i.stack.imgur.com/Dp5JD.png) As you can see in the image, when I search for eBay it also shows subpages like Deals, GlobalEasybay, etc. This happens for my site too, but the problem is that I don't have any control over it. It displays random sub-pages. My first question: how can I control this? Second: you can see in the right column of my screenshot there is a nice presentation of eBay with a picture and recent posts... How can I make something like this for my website?"} {"_id": "55814", "title": "How can my site show up on Google with last post date and secondary links?", "text": "How does one build a site to show up like this in the Google search results? 
![A nice ](http://i.imgur.com/pGJSbiX.png)"} {"_id": "57308", "title": "SEO for Google: navigation, company info, video", "text": "I often stumble upon search results in Google that get a special treatment, like: * having the site navigation (and more) displayed on the search results page ![Navigation, Location, Review, and special search](http://i.stack.imgur.com/IKnIB.png) * having a company's location, contact information, Google reviews and e.g. opening hours displayed beside the search results ![location, contact information, reviews, etc.](http://i.stack.imgur.com/SMLRm.png) * having a video thumbnail displayed with the search result and the navigation structure displayed instead of the plain URL (`giga.de` > Deals) ![video thumbnail and navigation instead of url](http://i.stack.imgur.com/aRJZZ.png) I'm building a website for my (nationally pretty well known) choir, but have not been into search engine optimization yet. There are many blogs and documents on that topic. But what I want is the special behavior described above. Can you tell me how I can achieve that? Or are there any documents that deal with these (and more?) special Google features?"} {"_id": "60875", "title": "How to show my links in a structured way like Amazon does", "text": "Can you please tell me how links like `Books` and `EC2` appear in search results associated with Amazon.com? How can I do something similar with my site's main sections? ![enter image description here](http://i.stack.imgur.com/vwMAx.png)"} {"_id": "22430", "title": "How to hide pages from the Google crawler?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? I'm currently working on a website and need to keep certain pages hidden from the Google crawler. How do I make it so that search engines see only what I want them to see in a directory? Also, you know how Google results also give you shortcut links, like 'Login', 'About' etc... 
How do I get those links into the search result?"} {"_id": "61682", "title": "Google Places links and info with my search results", "text": "I had set up a Google+ page a while ago and also created a Places page. I'm wondering how I can integrate this into my search results (an example is shown in the attached screenshot). I'd like the additional pages, Google+ links and map showing up on the side. I cannot for the life of me find out how I would go about this. I assumed it was automatic, but after some time I have come to the conclusion that it must require some additional input from my end. Many thanks ![enter image description here](http://i.stack.imgur.com/mTqq0.jpg)"} {"_id": "978", "title": "How to get Google to display navigation?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? How do I get Google to display the navigation for my site like it does for others? For example, take the search results for **'Microsoft'**: ![](http://files.quickmediasolutions.com/google_example.png) How do I get my site to display navigation like that? It's already the first entry for certain keywords."} {"_id": "19518", "title": "Can I control the Google sitelinks for my website?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? Working on SEO for a website, and I'm wondering if there's a way to have some say in which sub-links appear under the first entry when you Google the website's name (i.e. when you Google \"amazon\" there are 6 sub-links, including \"books\", \"music\", \"your account\", etc., under the entry). Currently if I Google my site, those 6 links include a couple of random pages as well as the Privacy Policy and Terms of Service, which are only linked to in the footer, but several of the main sections of the site that are linked to in the top menu are left out. 
(I say the pages it chooses are \"random\" because they don't correspond to the most-viewed pages based on Google Analytics, and I haven't done anything special SEO-wise to make them stand out.) Is there a way to choose, or at least influence, what these sub-links will be? Edit - What I call sub-links above are known as sitelinks (and are well documented in Google's Webmaster Tools, as Michael pointed out)."} {"_id": "42439", "title": "Unique search engine results", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? ![Search engine result](http://i.stack.imgur.com/Dzrur.jpg) How do I get that kind of result, with categories below the main result? What is it called? I want to do it as in the picture. Thanks in advance!"} {"_id": "47982", "title": "Display major links in Google", "text": "I am wondering how some websites get their sublinks displayed on Google too. For example, when I enter \"accenture\" on Google, apart from the main link (which is common), look at the links \"Graduate Careers at accenture\", \"Accenture careers\", etc. How can I display my other pages on Google in this way too? Also, when somebody enters my business name, how can I display my map on Google just like in the below screenshot? Check the screenshot here. Sorry, I don't have reputation 10 to post the image. :( http://i.stack.imgur.com/fZFkv.png"} {"_id": "29411", "title": "How to add a menu on a Google search result?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? I want to ask how to make a menu appear on a Google search result. E.g.: when you search on Google with the keyword \"Google community\" and find www.stackoverflow.com, there is a list of links below: Forums, Register Now, Members List, etc. How do I make this menu for my website? 
For example: ![enter image description here](http://i.stack.imgur.com/o4wZC.png)"} {"_id": "25590", "title": "How do we get our sites' sections displayed under the description in a Google search?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? For example, if I Google \"Stack Overflow\" the site comes up, but also links to the main sections of the site right beneath it (Login, Questions, Careers...) ![enter image description here](http://i.stack.imgur.com/ZEDel.png) How can I achieve this with my own site? Thanks!"} {"_id": "35738", "title": "Google Site Links", "text": "How can I get Google sitelinks?"} {"_id": "50877", "title": "How does Google determine important links on a website to display in search results?", "text": "Sometimes when I search for something on Google it shows some results (website links), but it also shows some important links on that website. Is it a feature of the website, or does Google use something to find those main links of the website? Is it related to search engine optimization?"} {"_id": "58164", "title": "Multiple Pages in Google Results", "text": "I've got what I would have thought would be a pretty widely discussed problem, but I can't seem to find an answer anywhere. This is more of an SEO question, but it also relates to general HTML structure. I'm trying to get all of the pages in a site to appear under one result in Google (preferably displaying the sub-pages underneath). I realise I can use robots.txt to block the crawler from viewing the other pages, but am I right in thinking the keywords in those pages will no longer contribute to search ranking? Edit: I should add that at the minute each page shows as a different result, which is annoying as the home page isn't first! I have in the head section of every page (minus the JS and extra style sheets): XXXXX Perhaps I'm wrong in thinking I should have this on every page? 
If anyone could shed some light on this, I'd really appreciate it!"} {"_id": "45424", "title": "Regarding sitelinks in Google search", "text": "I would like my site to display additional sitelinks when performing a Google search. How can this be achieved?"} {"_id": "56395", "title": "Where does Google get these links from?", "text": "I'm just about to optimize a website. Now I would like to know where Google obtains these links from: ![screenshot](http://i.stack.imgur.com/NhVTB.png) Why? I also want some links to appear in the search result for my homepage."} {"_id": "61555", "title": "Why do some websites have extended search results in Google?", "text": "Like this site: ![enter image description here](http://i.stack.imgur.com/sBaDc.jpg) What does the site do to look like that?"} {"_id": "67795", "title": "How to make a certain page of a website rank before others on the same website?", "text": "If I search for \"cnn news\" on Google, it appears as in the attached image. 1) If I wanted the search term \"cnn news\" to make CNN Student News appear first, would my best bet be to get anchor link text in a backlink such as `CNN News`? 2) I'm unclear as to how Google decides which page to use as the main link and then display all the sub-links under it (as in the attached pic). I hypothesized that it's the anchor link text in backlinks that makes the most difference, but I'm not sure. ![enter image description here](http://i.stack.imgur.com/6L9Fw.png) Any tips would be appreciated."} {"_id": "9090", "title": "How to display a site's menu in Google results", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? My site is first in Google if you type ' _\u05e2\u05d5\"\u05d3 \u05de\u05d9\u05de\u05d5\u05df \u05db\u05d4\u05df_ ' or ' _maimon cohen lawyer_ '. 
I want to display the menu index of the site in Google, like this: ![ubuntu in google](http://i.stack.imgur.com/epRYQ.png) How do I do it?"} {"_id": "50142", "title": "How do we organize our site's (navigation) pages in Google search?", "text": "**Is there an HTML technique for organizing my site's main pages in Google (navigation)? Or is it done with a specific Google tool for creating a result layout like in this picture?** ![enter image description here](http://i.stack.imgur.com/t9mdl.png) **Look at the difference in the results for my site:** ![enter image description here](http://i.stack.imgur.com/5HHU0.png)"} {"_id": "20936", "title": "How to organize your site on Google like Amazon?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? I don't know the name of this feature, nor how to search for it on Google, but I would like my website to be shown like this: http://jode.com.br/Joe/amazonongoogle.png I have full access to the site and to Google's tools for this site. Thanks"} {"_id": "9908", "title": "Formatting Google Search Result", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? Hello, I am new to search engine optimization. I am working on customizing how my results appear in Google as best as possible. I have learned about the meta tags to customize the text summary. However, I have some hierarchical parts to my website. When a result appears related to the \"tip of the iceberg\", I would like to show links related to the \"child\" pages. For instance, if you Google \"Walmart\" you will see the following links listed with the result: * Electronics * TV & Video * Departments * Furniture * Toys * Girls * Living Room * Computers Is there any way that I can help Google determine which links to show and the text to display for these child links on my site? Or is this something that Google generates automatically? 
thanks!"} {"_id": "55873", "title": "Having Google structure my search result", "text": "My question is fairly simple: How do I customize my Google Search result so when it comes up in Google, it gets this kind of fancy result that lets an user navigate my website directly from the results? This is what I'm talking about: ![Google result](http://i.stack.imgur.com/aHxPi.png) How do I make Google make my result like that?"} {"_id": "18443", "title": "How are Google Site links generated in search results?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? I just met a fellow designer who runs a website with the same pagerank as mine. When googling both our names, search results will show Site links below his website, but not mine. Do you know which attributes are handled by Google to define when and how these links are shown in search results?"} {"_id": "67965", "title": "Choose which pages appear in Google search results snippet?", "text": "I was wondering sometimes, the first result in Google's results has a list of pages (in the website) below the title of the search result. Is this something I can force Google to do, with some sort of HTML tags? Or does Google just do it automatically, when it feels like it?"} {"_id": "27490", "title": "How to make Google show my site in search result like the following image?", "text": "> **Possible Duplicate:** > What are the most important things I need to do to encourage Google > Sitelinks? Currently Google is displaying my site (http://layzend.info) like this in search result, only the link and meta description without any internal page links - ![enter image description here](http://i.stack.imgur.com/izyiE.jpg) But I want to be the search result like the following where the internal links are also displayed - ![enter image description here](http://i.stack.imgur.com/4kjr8.jpg) How is it possible? 
Please help me to make my site more SEO friendly."} {"_id": "54942", "title": "Will using page headings (h1, h2, h3 etc) as internal links encourage Google sitelinks?", "text": "I am trying to change my site's page headings to better emphasize the structure of the page's content hierarchy. I am hoping this will encourage Google Sitelinks and list snippets. Current headings are not optimized. I would like to use the heading as an internal link, e.g. `<h2><a href=\"...\">text</a></h2>`. Will this encourage Google sitelinks and better content indexing?"} {"_id": "67816", "title": "How to show my site homepage with page links", "text": "How can I show my site like below in Google search results? Please help me. Even my site homepage is not showing properly in Google search results. **1st Pic:** I want to show my site in Google like this. **2nd Pic:** Currently my site shows like this. **Pic 1:** ![search display in Google page](http://i.stack.imgur.com/2C1gq.png) ![enter image description here](http://i.stack.imgur.com/4un8M.png) **Pic 2:**"} {"_id": "11545", "title": "Video for an ad-driven website", "text": "I have a website which I will fill with a bunch of useful videos. I've implemented an ads rotation engine for articles and will do so for videos. The next milestone is to decide how video will be integrated. There are two ways: 1. Host videos myself. Pros: complete freedom. Cons: need tens of gigabytes of storage; support for multiple formats to be cross-browser and cross-device. 2. Use YouTube. Pros: very simple to use; nothing to do. What are the pros and cons of each way? Some questions for YouTube: 1. Will I be able to control playback of YouTube-embedded video to make post-rolls? 2. What is the ranking impact on my website when most of its pages refer to YouTube? 3. Will, say, an iPad play video embedded via YouTube's iFrame? 4. Does relying entirely on YouTube have a long-term perspective for a website that should bring in money?"} {"_id": "11548", "title": "Issue with WordPress theme background on widescreen computers", "text": "I am using the theme \"irresistible\" and I wanted to add a picture of the person for whom I am building the site. 
Normally the background CSS uses repeat, but I don't want their picture to repeat; I just want it all the way on the left side: body.woothemes { background: #18191b url(images/bg.jpg); background-repeat:no-repeat; background-position:left top; color: #e9e9e9; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 18px; } http://romesmooth1.jerseygetonline.com/ It doesn't repeat in Internet Explorer, but repeats in Firefox. In both it looks fine on a normal 17in LCD, but not on a widescreen."} {"_id": "28998", "title": "SEO best practices for sites with few pages but lots of lookup information?", "text": "I have a site that allows people to look up words that \"start with\" or \"end with\" a certain set of characters. I am trying to figure out how to get off on the right foot with search engines, and I was wondering: is it better to have URLs that appear to be unique pages? For example: mySite.com/StartsWith/pred instead of mySite.com/Words.aspx?StartsWith=pred Both of these would return the same data, but I am wondering if the first is better because it appears to be a unique page to a crawler? At the end of the day the source code will only contain about 6 pages, but with all of the StartsWith/EndsWith letter sets, there are probably millions of possible combinations that people could get to. How would I (and should I) create a site map for pages that do not physically exist, but have unique URLs with unique content? Are there any other steps I should take to make sure that crawlers can find all of these different combinations? **Update** There will be no duplicate content on the site"} {"_id": "55293", "title": "SEOquake is not recognizing microformats", "text": "I generated rich snippets (for an organization) microdata for one of my websites, using instructions like https://support.google.com/webmasters/answer/146897?hl=en http://en.wikipedia.org/wiki/Microformat The problem I have is that the microformat is not recognized by tools like SEOquake. 
Can you help me generate microdata in the proper format?"} {"_id": "5988", "title": "How to serve custom ads for clients on my site?", "text": "Are there any ad serving systems that allow webmasters to serve ads from their clients on their sites? The client provides an image/text and a link, and the website displays the ad in designated spaces. The website would display only ads from its own customers (nothing from a public ad network like AdSense). Can you please provide examples of such ad serving solutions? Preferably free/open source."} {"_id": "5989", "title": "Plugin to back up a WordPress (version 2.2.1) site?", "text": "Which WordPress plugin would you recommend to someone with only basic PHP and MySQL skills to back up a quite large WP (engine version 2.2.1) site as safely as possible? The backup should allow redeploying the site with/to an updated, clean install of the WP engine."} {"_id": "5982", "title": "WordPress vs. clean HTML/CSS", "text": "When it comes to designing/coding presentational websites with static content and few pages (let's say between 5 and 20), would you go with WP, design a custom theme and update it with content, or would you code clean HTML/CSS from scratch, possibly plus some JS? For now, since I'm just stepping into web development, I see WordPress as a complex CMS that loads slowly in my browser. **EDIT:** I suppose a page generated with WordPress would load slower than a page that looks the same coded from scratch. Please help me find the pros/cons of both. How about SEO/indexing? Is it worth learning WP for designing presentational websites? P.S. I'm asking this because a lot of my friends, who are a bit more experienced than me, tend to recommend using WP because clients like it and it's easy, but I'm not sure yet. 
When it comes to you, what would you prefer, and why?"} {"_id": "51266", "title": "How to monitor traffic on a single page that I control when I don't control the domain?", "text": "I have a webpage like www.site.com/myaccount/foo/index.htm I have complete control over the HTML on this page (e.g., I can insert a script). However, I have no control over the rest of the site. **update:** Note that I only want information on traffic to the page under my control. I have no interest in the traffic for the rest of the site. I'd particularly like to get an idea of referring URLs to the page under my control. * **Will Google Analytics work on a single page under my control?** * **If not, are there other tools suited to monitoring traffic to a single page?** Note: I saw this question, but it did not seem to answer my question. Also, I'm not tied to using Google Analytics if another online system would work. Specifically, I have added the Google Analytics tracking code to the page, but am still getting the error > Status: Tracking Not Installed The Google Analytics tracking code has not > been detected on your website's home page. For Analytics to function, you or > your web administrator must add the code to each page of your website. This made me think that I might need to have access to `www.site.com/index.htm` **Update:** After waiting another day, Google Analytics appears to have registered the tracking code. I presumably just had to wait a little longer."} {"_id": "5980", "title": "SEO HTML for a plain HTML/CSS website", "text": "This isn't really a programming question, but I couldn't think of a place where people could better answer my question than here. I have always had a Joomla website; for me it's good, but it's not very flexible. My question is: in Joomla you have some great SEO tools, like labeling or tagging your articles. I'm planning to make my own CSS/HTML-based website without a CMS behind it. 
Does anyone knows a labeling/tagging script whichcan be used for \"normal\" html/css based websites without for example joomla?"} {"_id": "19394", "title": "New visits count disparity on a month old site in Google Analytics", "text": "I launched a new website less than a month ago but the stats on \"new visits\" doesn't appear to match with \"absolute unique visitors\". If I look at the dashboard, it shows 2040 visits, 956 unique visitors and 33.68% new visits. A \"new visit\" is someone who's never been to the site before. So in this case, shouldn't the \"new visits\" match the \"absolute unique visitors\"? ![dashboard of urchin](http://i.stack.imgur.com/xZYYS.jpg)"} {"_id": "5986", "title": "Thoughts on this approach for a news post caching system?", "text": "I'm working on a website which uses /news/title-of-news-article--111.html (111 = unique ID of the news article). I was thinking it might be a good idea to have some form of a caching system for the news articles. Essentially, my idea would be to have a .html version of the news article (i.e. just the news part that's dynamically generated) be created when it was marked live (making it so visitors and search engines can see it). The .php script that displays the content would check to see if a .html file with the news id exists, and would use that instead of querying the database. Of course, if the .html file didn't exists for some reason, it would fall back to querying the database like the site currently does now. Any issues you guys see with that approach? This is meant more for when a ton of visitors view a news article, since ideally I'd assume grabbing a \"static\" .html file is better on the server and the database than querying."} {"_id": "30925", "title": "Google Webmaster Tools proportional to Google exams?", "text": "I've taken no Google exams, I'm curious about apps becoming available in your Google webmasters tools and google apps the more exams you pass. 
Someone I know told me that the more exams you pass, the more tools become available to you but I cannot find any information about it anywhere online. Is this some sort of secret or am I just misinformed?"} {"_id": "5984", "title": "How important is the logo for a freelancer?", "text": "How important is a logo for a freelancer ? If I'm going to design a small static website to represent myself with 'about', 'skills', 'portfolio' ... etc. pages and I want to keep the design as simple as I can... Is it crucial to take lots of time and try to design an abstract logo that would represent me as a freelancer, or what I do ? Work I do: web design and development, custom solutions.. etc. Target audience: small business owners, and potential recruiters ?"} {"_id": "22261", "title": "How can I exclude content in my notifications bar from being indexed?", "text": "Of course I want my content to be indexed pretty fast by search engines, however not my notifications bar. My notifications bar contains the last 30 changes to content on the site, and I don't want this to show in my SEO meta. As all the notifications are generic, it often doesn't provide any relevant information. As I said the notifications are generic. If an article named \"123\" was created, it would create a notification that says \"Article \"123\" was created by xxx at 12:00AM\". I'm now wondering if this is a content design problem. As only 1/3 of this information is actually relevant to users (the title, what happened). By SEO meta, and irrelevant notification data being shown, I mean this - ![enter image description here](http://i.stack.imgur.com/3k6Sw.png) Basically what I was wondering, is how I could optimise this, so search engines wouldn't show this generic nonsense."} {"_id": "61179", "title": "Google Anlytics - User Type Tracking", "text": "just a question about a custom dashboard in GA. I'd like to to monitor only help/faq pages in my site. 
I've created a dashboard that filter only the pages I need vs Unique Page views, but I want a deeper analysis. For example, does it makes sense to track User Type (new/returning visitors), to know if the user that is viewing my help pages is someone who visit often these pages (like a reference resource to check everytime you need) or if they are mainly new visitors. In this case, it's better to track Sessions vs User type OR Unique Page Views vs User Type? Inside an help section, can we talk of sessions? Or session is something wider, meaningful only in the whole site perspective?"} {"_id": "22268", "title": "When redirecting from http to https in a shop site, which status code should I use?", "text": "On a shop website, when \"Pay now\" is clicked we perform a header redirection to the same URL, just an SSL secured https version. In such a common scenario, should we use a permanent (301), a temporary (302) or any other status code? Somehow, neither permanent nor temporary feels right (though I guess the latter will be more appropriate)."} {"_id": "48890", "title": "Extensive GET /wpad.dat HTTP/1.1 requests", "text": "We are having extensive (thousands per minute) `GET /wpad.dat HTTP/1.1` requests at our Linux server, causing traffic problems and slowing down access to the websites we host. What does it stand for and how to prevent it?"} {"_id": "39327", "title": "Apache Virtual Host does not work", "text": "I've problems setting up a virtual host on windows7, so that I can develop multiple pages on my localhost. 
To set up the virtual Host, I've edited 3 Files: * **httpd-vhosts.conf** in apache\\conf\\extra * **httpd.conf** in apache\\conf * **hosts** in system32\\drivers\\etc This is what I've done in order to make the page run on Port 81: added this block in **httpd-vhosts.conf** NameVirtualHost *:81 DocumentRoot C:\\xampp\\htdocs\\mypage ServerName mypage DirectoryIndex index.html index.php Options Indexes FollowSymLinks Includes ExecCGI Order allow,deny Allow from all AllowOverride All made apache Listen on Port 81 in **httpd.conf** Listen 80 Listen 81 added this line in **hosts** 127.0.0.1 mypage But as soon as I try to access mypage via localhost:81, I get redirected to localhost/xampp."} {"_id": "19", "title": "Is there a better analytics package than Google Analytics?", "text": "I've used Google Analytics for a while now, and it's pretty good for a high level view. But when I want to dig into the numbers a bit more, everything gets kind of vague. I also don't like that it takes so long to update. Are there any better options?"} {"_id": "45570", "title": "Solution for tracking lead generations and share data with partners", "text": "I need to implement a lead generation feature on a website. Technically I understand how it works (e.g. `Javascript` script associated with the `onlick` event of a button which sends data about the user to a tracking system before triggering the action, such as redirecting user to another website, revealing an email address or phone number...) however what I need is a tracking system which my partners will trust and with which I can share the collected data. I know one can use `Google Analytics` to track actions (such as lead generation) but the problem is that the data sharing features of `GA` are too limited: I can only decide which profiles (i.e. websites) a user can see, but not what for that profile (so it's all or nothing). 
**Does anybody know about a web analytics solution which match the following criteria?** -allows tracking click events (similar way as `GA`) -enables me to share analytics about 1 particular tracking code only with another user -is free (ideally) -is used by a few big names so that my partners will trust it (ideally)"} {"_id": "25063", "title": "Any solutions for transferring media between Mobile Clients", "text": "Please bear with me as I am a complete beginner at this but looking to learn. I am looking for a service or any advice in a solution to transfer media from Mobile to Mobile. Say I want to send a picture from my cell phone to yours through a server, when that picture is received the data does not persist on the server. Is there any server that accomplishes this or are is there a server setup/configuration that exists to do this? Thank you so much and my apologies if this is the wrong place to post this. Between Stack Overflow, Serverfault and this site. I had no idea which one was the proper place for such a generic question."} {"_id": "55347", "title": "URL rewrite from www.domain.com/sudirectory to http://domain.com/subdirectory", "text": "I need a solution for the following problem: I use a CMS and want the backend only be available at `http://domain.com/backend` and not at `http://www.domain.com/backend`. How do I have to change my _.htaccess_ file to achieve this? I already have a rewrite rule from HTTP (non-www) to www. Here's what I currently have in my _.htaccess_ file: ## # Uncomment the following lines to add \"www.\" to the domain: # RewriteCond %{HTTP_HOST} ^shaba-baden\\.ch$ [NC] RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L] # # Uncomment the following lines to remove \"www.\" from the domain: # # RewriteCond %{HTTP_HOST} ^www\\.example\\.com$ [NC] # RewriteRule (.*) http://example.com/$1 [R=301,L] # # Make sure to replace \"example.com\" with your domain name. ## So, the first bit is the redirect from HTTP to www. 
It works on the domain part of the URL. As explained, I need a rewrite rule from the backend login at `http://www.shaba-baden.ch/contao` to `http://shaba-baden.ch/contao`"} {"_id": "43879", "title": "How to solve 503 issue", "text": "I'm participating in a web project. It's a Joomla web page and it's gained a lot of visitors in the past months. Actually we are receiving donations each year to pay site's hosting and domain. The concurrence now is extremely high and we don't really want to use advertisement as business model 'cause a philosophy matter. We can not afford a dedicated server 'cause we are a non- profit community. So my question is: Is there any way to solve 503 issue because of the high concurrence without migrating to a more expensive hosting plan ? Can I place the site in different hosts or something cheaper ? Thank you very much."} {"_id": "43878", "title": "Do long geographic lists \"we cover these towns\" work or are they penalised by Google?", "text": "Some small business type websites list their geographic areas covered, often crudely listing 30 or 40 towns, and then a big list of postcode areas. Here's an example for skip hire: http://nottinghamskip.com/skips-delivered-nottingham-mansfield ![enter image description here](http://i.stack.imgur.com/KKjg9.jpg) Obviously you want to match for people searching: > \"skip hire Swadlincote\" Probably this practice did once work. Does it still, or does it get penalised?"} {"_id": "25065", "title": "Migrating Google Accounts into Google Apps", "text": "My company wants to migrate all email and cloud work to Google Apps for Business. We currently have a Google account, however, with our domain appended to the user name. It's my understanding that Google no longer allows setting up accounts like this, and forces a `*@gmail.com` address. So, in this case, we already use `webguy@example.com`. 
In a previous case where I have migrated to Google Apps in the past, if you set up your account with domain `example.com`, any existing Google accounts with that domain are either smitten or forced to choose new user names. We would like to continue to use `webguy@example.com`, but it's not as simple as exporting email, calendar, and docs, as we also have our AdWords, Analytics, and AdSense accounts tied to this username. So ultimately the question is if anyone knows how to migrate this one account straight into the Google Apps for Business domain either before or once it's set up?"} {"_id": "43874", "title": "How often does GWT check dynamic sitemaps?", "text": "I'm working on a fairly large site, that generates a dynamic sitemap hourly. Now in Google Webmaster Tools the sitemap isn't submitted yet and I'm shying away because I'm afraid that the new content (which appears in the dynamic sitemap) won't get crawled as quickly. So my question is: How often do the GWT check the sitemap once submitted? Any other thing I should be aware of when working with GWT and dynamic pages? P.S. I checked this thread How often are sitemap.xml checked for updates by crawlers? and from what I understand Google crawls more often when the site gets updated regularly - but does the same apply for the GWT?"} {"_id": "43873", "title": "Are there any SEO benefits by using Google Tag Manager?", "text": "Google introduced Google Tag Manager. When following the 4 steps to create it, the keyword and website URL are optional. I created a container without keyword and URL, but it gave me the following script code and said to paste it to all the website pages. What is the benefit for me? \" This code is not related to my domain and keyword."} {"_id": "58735", "title": "How to map requests to an external directory", "text": "I have a directory/application that is located outside of the web root directory of my site. 
Say the site is here: /var/www/site/htdocs/ And the external app is located here: /var/www/apps/coolapp/ My question is how can I configure nginx to map/route all requests that are like `www.mysite.com/coolapp/*` (asterisk being wildcard) to the external location `/var/www/apps/coolapp/`? For example, www.mysite.com/coolapp/test.php should server `/var/www/apps/coolapp/test.php`. Per @krokola's answer, I tried adding the `alias` directive in the `production.conf` file that the main `nginx.conf` file includes. Here is what `production.conf` currently looks like server { listen 80; listen 443 ssl; ssl_certificate /blah/blah/blah; ssl_certificate_key /blah/blah/blah; ssl_protocols blah blah blah; ssl_ciphers blahblahblah; ssl_prefer_server_ciphers blahblah; access_log /var/log/nginx/www.mysite.com-access.log; error_log /var/log/nginx/www.mysite.com-error.log error; server_name mysite.com www.mysite.com; root /var/www/site/htdocs; include conf/magento_rewrites.conf; include conf/magento_security.conf; include /var/www/site/nginx/*.conf; #-------CODE IN QUESTION------- location /coolapp/ { alias /var/www/apps/coolapp/; location ~ \\.php { # Copied from \"# PHP Handler\" below fastcgi_param MAGE_RUN_CODE default; fastcgi_param MAGE_RUN_TYPE store; fastcgi_param HTTPS $fastcgi_https; rewrite_log on; # By default, only handle fcgi without caching include conf/magento_fcgi.conf; } } # PHP handler location ~ \\.php { ## Catch 404s that try_files miss if (!-e $request_filename) { rewrite / /index.php last; } ## Store code is defined in administration > Configuration > Manage Stores fastcgi_param MAGE_RUN_CODE default; fastcgi_param MAGE_RUN_TYPE store; fastcgi_param HTTPS $fastcgi_https; rewrite_log on; # By default, only handle fcgi without caching include conf/magento_fcgi.conf; } # 404s are handled by front controller location @magefc { rewrite ^(.*) /index.php?$query_string last; } # Last path match hands to magento or sets global cache-control location / { ## Maintenance 
page overrides front controller index index.html index.php; try_files $uri $uri/ @magefc; expires 24h; } } conf/magento_fcgi.conf looks like this: fastcgi_pass phpfpm; ## Tell the upstream who is making the request proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_redirect off; # Ensure the admin panels have enough time to complete large requests ie: report generation, product import/export proxy_read_timeout 1600s; # Ensure PHP knows when we use HTTPS fastcgi_param HTTPS $fastcgi_https; ## Fcgi Settings include fastcgi_params; fastcgi_connect_timeout 120; fastcgi_send_timeout 320s; fastcgi_read_timeout 1600s; fastcgi_buffer_size 128k; fastcgi_buffers 512 64k; fastcgi_busy_buffers_size 128k; fastcgi_temp_file_write_size 256k; fastcgi_intercept_errors off; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; fastcgi_param SCRIPT_NAME $fastcgi_script_name; # nginx will buffer objects to disk that are too large for the buffers above fastcgi_temp_path /tmpfs/nginx/tmp 1 2; #fastcgi_keep_conn on; # NGINX 1.1.14 expires off; ## Do not cache dynamic content"} {"_id": "19715", "title": "How to develop a good web design in web site", "text": "I have a web development site, but I do better and better design in my site, because I develop a good web site design so please reply me if you know of other ideas. My website is located at http://www.esparkinfo.com/"} {"_id": "20838", "title": "How to find web hosting that meets my requirements?", "text": "_This is a \"catch-all\" question designed to serve as an answer for all questions about choosing web hosting. Pro Webmasters no longer accepts new questions about how to choose hosting. All future questions pertaining to finding web hosting should be closed as a duplicate of this question. 
For more information about this policy please seethis meta question._ * * * **How to find web hosting that meets my requirements?** What we're looking for in answers to this question are the basics about web hosting: * What is web hosting? * What is the difference between shared, VPS, and dedicated hosting? * How does a content delivery network relate to web hosting? * Anything else you feel is helpful in finding a web host. What we do not want is: * Endorsements or recommendations for specific web hosts * We do not want your experience or other subjective information (just the facts please)"} {"_id": "56206", "title": "VPS and hosting", "text": "NO this is no duplicate of the above. My question is very specific to VPS and is not solely related to hosting but also to virtual desktop and such. The above barely writes 2 lines about VPS and never speak of virtual desktop needs. **CASE:** I'm going to upload a (small) symfony2 website soon. I plan to create another one after and maybe a little wt app ( which forces me to have a vps ). I also would like to have a 100% uptime virtual desktop + storage available from anywhere. for symfony2, I need to be able to manage apc pecl cache extension at the very least. For wt I need root access to compile and run c/c++ code and play with apache. **Questions:** * When it comes to VPS, price differences are huge for similar \"front\" characteristics. Why ? * Given my needs ( website traffic should never exceed 100 dudes daily ). What should I look for ? ( CPU, RAM[guaranteed?], Bandwidth ) Basically I'm looking for one or two guys that have experienced professionally a few VPS and PHP-hosting solutions, and be advised about everything I ask... 
and did not ask !"} {"_id": "65568", "title": "What dedicated server is for $8 each month?", "text": "What dedicated server is for $8 or less each month and has at least 512MB ram?"} {"_id": "47162", "title": "What are my options for a hosted vps service with shell access?", "text": "Disclaimer (some folks marked this question as a duplicate): I DO NOT want suggestions for things like a hosted wordpress providers, nor am I asking what web hosting is. This question is not a duplicate of: How to find web hosting that meets my requirements? -- I just want suggestions from other technologists about viable alternatives. I've heard about linode which sounds promising. I'd like recommendations for specific web hosts that provide VPS hosting providing a lot of control over the machine (like linode). I want to install packages such as: tmux, python 2.7, a java runtime, node.js runtime, mysql, memcached, in-memory queuing libraries, etc. In short, I want an \"always on\" machine for doing personal development projects, which may lead to commercial purposes in the future. Some specifics and tools I'm looking for are: * cheap! (personal use for now $0 - $30/mo) * shell access (sudo access for package management on my own host VMs) * control over dns * ability to alter cpu/memory/disk resources as needed (depends on the project) * Centos 6.x (or other linux OS that plays nice with AWS) Some web searching led me down a rabbit hole (and annoying sales/marketing pitch). A small list of hosting providers meeting the above criteria and some comparison of them from technologists would clarify the info I need to make an informed decision."} {"_id": "15501", "title": "Need a host which supports OSQA", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? 
Hi i'm looking to install OSQA and see how it goes I have a great niche which I think may work real well, but till I get a large enough audience I'd like to use shared hosting then move up to a dedicated or vps hosting... Almost all hosts i've looked at don't support something OSQA needs I need relatively cheap shared hosting with cpanel. Any recommendations? It needs to support: * Django * Python markdown * html5lib * Python OpenId * South"} {"_id": "61384", "title": "Any suggestions on free web hosting service with your registered domain name", "text": "I have registered a domain. I am looking for a free web-hosting service where I can attach this domain name to site hosted on that service. Weebly at one point used to provide this service for free but not anymore. You can still host a site for free on weebly but if you want to use domain name then that's not possible for free ( at least to my knowledge). Can any one point me to any such free service?"} {"_id": "51878", "title": "Free Basic Hosting w SSL Certificate", "text": "I'm looking for a free hosting provider which provides an SSL Certificate at no additional charge. Does anyone know of a hosting provider which can provide such a service? Thanks in advance, Amani Swann"} {"_id": "11815", "title": "Best provider for webinar hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We are looking at running some webinars soon, and I was wondering if you folks had any experience with a service you have been happy with. We are currently looking at using one of the following (but are always up to suggestions): http://www.megameeting.com/overview.html https://www.videoseminarlive.com/index.aspx http://www.webex.com/ We also looked at: http://www.gotomeeting.com But they seem to not support video at this time. 
Specifically, we need to be able to provide a live video feed from an instructor (as well as display slides), and accept questions/discussions from the audience via audio or chat. Of course, it is always good to have a clean interface, and if the system works without installing any software, that is a plus as well. Any thoughts?? --Edited to provide details for system requirements."} {"_id": "34227", "title": "Free online PHP hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have a PHP script that can take $_GET parameters from a URL (i.e. http://www.example.com/test.php?name=george). I'd like to be able to host this script online so that others can pass parameters to it to obtain the returned data. Anyone know of a free PHP hosting site that would allow for his functionality? (PS: I can't host it myself) Thanks!"} {"_id": "154", "title": "Shared vs. Dedicated Hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? What is the difference between shared hosting and dedicated hosting for my website?"} {"_id": "31739", "title": "Scalable web-hosting for a youtube-like service (no, not porn)", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? My business partner and I are looking for a European web-hosting service (we are situated in Europe). That service needs to be, needs to have: * international servers, a server for each continent at the very least. * a high amount of bandwidth. * highly scalable, since we are expecting to start off small, but as our user base grows so will everything else (again, no porn or phallic jokes) need to do. * a moderate to supreme customer service. * of course a small downtime per annum. * affordable at first, fair as we grow. I think that is all. Any input is greatly appreciated. 
Thank you in advance."} {"_id": "39898", "title": "Which Web Hosting service should choose?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I was using a web hosting service (no need to mention its name) that causes me a lot of trouble with my code, especially with loading videos, sounds... etc (you can follow one of my issues caused by the FTP server Here), so I decided to change the web hosting. * So depending on what I know; what's the best? * What would you suggest for me? * What do you think about these: hostgator.com, godaddy.com, ipage.com? * Is the location (Place) involved? (I am working on a website for a Middle East company)"} {"_id": "3505", "title": "Fast, Cheap, Dedicated Hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for dedicated hosting with at least 1GB of RAM that doesn't hurt the wallet too much. Specifically it should have goods speed around the world as well. Suggestions?"} {"_id": "45214", "title": "Mac Mini Colocation Usage Case", "text": "I'm looking at purchasing colocation services for a Mac Mini with A specialist colocation company - but as of yet I have no experience with OSX Server for hosting websites. I'm looking for a less involved way of hosting up to 80 small (1-20 page, <1000 daily hits) WordPress websites for my clients. All of the websites use caching aggressively. Does anyone have any experience using OSX Server for WP hosting / any ideas as to how many sites it can comfortably (ie minimal loss of speed) handle? My intention is to purchase one with an i5 processor and 16GB of RAM."} {"_id": "54034", "title": "Hosting of static content without server, only through CDN", "text": "My content is fully static, using only HTML, CSS, JS. It's only one page, which needs to load really fast from any country, and should be online for only 3 months. I already have the domain name. 
I just need a CDN where I would keep my HTML, CSS, JS and image files, to be served really fast in Japan, China or USA. Do I still need to have a web server? Do CDN provider also offers simple hosting?"} {"_id": "30180", "title": "What now... Godaddy host not support charting in asp.net", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I was going to use the asp.net charting with Visual Studios 2008 but GoDaddy hosting does not support the charting cause they won't install the component. So what are my options for charting/graphs for data? Looking for free if possible. Is Telerix the best?"} {"_id": "11248", "title": "What are the options for hosting a small Plone site?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I\u2019ve developed a portfolio website for myself using Plone 4, and I\u2019m looking for someplace to host it. Most Plone hosting services seem to focus on large, corporate deployments, but I need something that I can afford on a very limited budget and fits a small, single-admin website. My understanding is that my basic options are thus: 1. I can go with a hosting service that specifically provides Plone. I know of WebFaction, but what others exist? Also, I\u2019d have two stipulations for a Plone hosting service: (a) It needs to use Plone 4, for which I\u2019ve developed my site, and (b) it needs to allow me SSH access to a home directory (including the Plone configuration), so that I may use my custom development eggs and such. 2. I could use a VPS hosting service. What are my options here? Again, I need something cheap and scaled to my level. 3. I could use Amazon EC2 or a similar service (please tell me of any) and pay by the tiniest unit of data. I\u2019m a little scared of this because I have no idea how to do a cost-benefit analysis between this and a regular VPS host. 
The advantage of this approach would be that I only pay for what I use, making it very scalable, but I don\u2019t know how the overall cost would compare to any VPS host under similar circumstances. What factors enter into the cost of Amazon EC2? What can I expect to pay under either option for regular traffic for a new website? Which one is more desirable for when a rush of visitors drive up my bandwidth bill? One last note: I know Plone isn\u2019t common for websites for individuals, but please don\u2019t try to talk me out of it here; that\u2019s a completely different subject. For now, assume I\u2019m sticking with Plone for good. Also, I have seen the Plone hosting services list on Plone.org\u2014it\u2019s twenty pages long, and the first page was nothing but professional Plone consulting services that sometimes offer hosting for business clients. So, that wasn\u2019t much help. Thank you!"} {"_id": "10951", "title": "Is there a free Django hosting service out there?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I would like to put my Django web-site online. Is it possible to do so for free?"} {"_id": "36180", "title": "Recommanded cloud/cdn for my website?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Currently we are using Cloudflare (Business) and it's great, except that our website is crushing for around 1 minute every 2 hours on average, and it seems they don't like to give us support. So, we are looking for alternative. I thought about Incapsula. Currently we have around 1mil pageviews daily. Can you guys please recommand for us what to use/do? Thanks!"} {"_id": "14982", "title": "MySQL server with website hosting with managed hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I work for a small business, and I _am_ the \"IT Department\". 
I also happen to be a summer intern, so after I leave the number of IT staff will be 0 for an indefinite period of time. I need a place where I can reliably put up the company's website as well as the mySQL+PHP backend. Good redundancy is a plus, as well as easy administration for my IT-challenged colleagues. Managed hosting would be good, so the PHP versions can update without my company having to hire an admin. **EDIT** : The company already has a mySQL+PHP server running locally which hosts the existing website. My assignment is to find a remote server where the latest versions of mySQL+PHP will be maintained and where there is a very small chance of unintended downtime. Can you recommend anything?"} {"_id": "25362", "title": "I have a linux image how can I find somebody that will host it?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? The image has a site running on php 4 using some archaic CMS + some version of My Sql + etc. I need to find a host, but I am not sure if the hosting providers will host an old image I don't know who the previous host was, just being handed the image by the previous developer. PS : If there is an obvious solution that I am missing ( pardon my ignorance, I have not done this before, I am just a windows developer )."} {"_id": "20856", "title": "Which Hosting Plan Should I use for this Web Application", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am developing a Web Application [PHP & MySQL] which is basically a rating website ( a social bookmarking site ). There are many tables in the Database like: Users, Notifications, Posts, Following and more. Which Hosting Plan should I use to host it : Shared or VPS. And yes, I don't think it needs a dedicated web server. If traffic grows, I will upgrade the plan to dedicated. One more thing: I am not using any CMS or Framework. 
I am writing the code myself and I will also try to make the website faster ( and lighter on the server ) by hosting CSS and JS files in DropBox. I will buy the hosting plan from Hostgator (India) : is it good or should I go with another company?"} {"_id": "16050", "title": "Is there any recommended windows web hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? My plan is to deploy a web site based on Asp.net + MS SQL server. I am not familiar with the windows web hosting market. Can anyone provide some suggestion? Thanks a lot."} {"_id": "33087", "title": "Where can I host my app?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm creating an web app which is PHP based. Some more info about app: * Image processing (not heavy processing - GD extension requried / maybe ImageMagick later) * It'll mainly display images * MongoDB is required since app is built on top of it * Expected tons of people after few days after launch - easy scaling * PHP 5.3+ is required * European hosting is PLUS I want high quality service and uptime as high as possible, but since I'm 17 yo, I cannot pay too much. I also don't want shared hosting. I've checked phpfog.com / appfog.com and It seems good but it's a bit too expensive (~29$/month app fog, later probably 79$/month and then there is mongo hosting which is 15$/month and later probably 49$+). I've heard for other hosting providers such as Linode, Amazon web services... But I don't really know which one to pick + what I'll have to know to set everything up. What are my other options? I've never had VPS (I have my own server home and all testing is done at home), and I probably wont know how to set everything up (postfix, failovers...). What can you suggest?"} {"_id": "33887", "title": "Difference between shared hosting and PAAS?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? They seem very similar. 
Are there important differences I should be aware of?"} {"_id": "18480", "title": "EU Python hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for Python hosting in the EU. I need WSGI support (nginx+uwsgi, apache+mod_wsgi or similar). What are the options, excluding VPS/dedicated servers?"} {"_id": "42175", "title": "Place to host picture/sound gallery", "text": "I have the following problem. I own a wallpaper/ringtone collection mobile app and I need a place to host the collection itself. So I need hosting with the following parameters: * 1-2 GB disk space * large/unlimited traffic (it will be media-file traffic) * large/unlimited unique users Can anyone help please?"} {"_id": "6495", "title": "What should I use for storage for a photo-sharing website", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm building a photo-collection/gallery-sharing web app. I need to know what I should consider for storage. I only have a few GB on my website, and it's not possible for me to get the amount of storage I need at the same place I have my website. Would it make sense to use a cloud service for this purpose? Is it possible to use some online storage service like Hotfile or the like? What would be best? I'm thinking at least 20-30 GB of space is needed. Thanks in advance."} {"_id": "12743", "title": "Looking for recommendation for PHP app hosting with (relatively) painless scalability", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We're a pre-launch startup with a LAMP-based application. We're looking for recommendations for hosting services for our upcoming beta launch. Our main requirements: 1. LAMP support 2. Ability to schedule/start daily/hourly server-initiated batch jobs against our databases. 3.
Relatively painless scalability when we grow - adding servers/databases/bandwidth without having to re-architect everything or cause major outages. 4. Support for WordPress. Pointers will be appreciated."} {"_id": "54756", "title": "I need a flexible VPS like gandi.com", "text": "What are the alternatives? I am not asking which one you recommend; I am simply asking for a list of similar services. That being said, if there is any you've used and are happy with, please let me know. This is what I am considering: http://en.gandi.net/ Some say it's overpriced. It's like Amazon AWS, I think. So I need VPS hosting where I can easily change the CPU requirements, etc."} {"_id": "48242", "title": "Advice scaling website", "text": "We could use advice on a scaling/ops issue. We have a simple website that runs on Rails 3.2.12 and uses MongoMapper instead of ActiveRecord. There is one database call that sporadically performs poorly, causing users to write in and complain. It isn't clear why. We can't install New Relic because of MongoMapper, and the data returned by Mongo isn't a lot. There isn't much logic being executed in the controller, either. One potential explanation is that we use a VPS shared with 30 nodes. The hosting company, RailsPlayground, says the machine's average I/O utilization is only 12% but can't provide more extensive stats (e.g., peak I/O utilization). The question: would moving to a dedicated server help? I realize this is difficult to answer, but any general thoughts/advice would be appreciated."} {"_id": "26774", "title": "Specific hosting or virtual machine?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm searching for hosting for the back-end and web client of an application that uses Node.js and MongoDB on the back-end and PHP on the web client.
There are many options for hosting Node: * Heroku * Nodester * Joyent For PHP, nearly all hosting options are capable of running PHP, and for MongoDB the Node hosts allow databases as well. The project will have low usage; would it be better to use VPS hosting where I can install all the software needed (PHP, Node and Mongo), like Amazon EC2 micro instances? Are there any good alternatives to Amazon EC2?"} {"_id": "11356", "title": "Clustered Web host", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I live in Denmark and need to find a web host that can keep up at a world-class level. If anyone is familiar with hosts that can meet the requirements below, please share. **Overall** * Clustered Web * 24/7 Expert Helpdesk and Server Monitoring * Linux Operating System * Unlimited Subdomains * 50 MySQL Databases * FTP and FTPS Access * Fast connectivity to any destination - World Wide. (Stable/Low DNS, TTFB and similar) **Management** * Full DNS Management * Web Mail Access * phpMyAdmin * Cronjobs **Mail** * IMAP/POP3/SMTP * Mail Auto Responders * Catch-All Mailbox * Microsoft Exchange Enabled **Apache** * Python and Perl CGI Support * Secure Server (SSL) * mod_rewrite * Full .htaccess Support **PHP** * PHP v 4.4 & 5.2 * ImageMagick and GD * CURL * RTF, POWERPOINT, EXCEL, WORD and PDF parser * Zip Utility * xhprof"} {"_id": "18389", "title": "Good service to host CPU intensive web application", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Currently, I am a happy user of Google App Engine. However, I am planning to host a CPU-intensive web app (which is going to apply all sorts of signal processing algorithms); is there any other good service available? As I realize, Google App Engine has placed restrictions on per-minute CPU time.
http://code.google.com/appengine/docs/quotas.html#Per-minute_Quotas"} {"_id": "6938", "title": "Hosting plans that allow more than 1 GB for MySQL and easier mod_rewrites?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm using GoDaddy and am thinking of moving to another hosting company that allows easier mod_rewrites and a bigger limit for a MySQL database. Does anyone know of a better hosting company that does this?"} {"_id": "13537", "title": "I need a cheap, reliable, virtual webserver service", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? So, we used linode.com for a while, and that was awesome, but I need my own virtual server now, and the problem with Linode is that you have to install everything from scratch. I'd like a server I can get my hands on with a default installation of everything necessary, the ability to use root and all that (that said, I'd prefer Linux... though I have experience with Windows Server, I just don't prefer it for my webhosting), and any kind of decent management tool online. I don't want to spend more than $20 or $30 a month on that, and that is super important. If I can't get a good VPS, what is a good site for hosting that isn't GoDaddy? I've only used GoDaddy, clients' servers, or my current webserver on Linode. Thanks everyone!"} {"_id": "26667", "title": "What is the best dedicated hosting provider for a file sharing site?", "text": "What is the best place to host a file sharing site? What is dedicated Linux hosting and Windows hosting? What do I need to start a file sharing site?"} {"_id": "23942", "title": "How can I choose a reliable dedicated or semi-dedicated Windows hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
# How to find your hosting company 101 for dummies _There's a TL;DR at the end._ ## Long version I've been asked by a friend who understands nothing about information technology to help keep him from being scammed. This person runs a family-size retail company through an e-commerce site, developed and hosted by the same web agency. He no longer trusts this web agency and wants to get away from them; the first step he wants to perform is to move the e-commerce site from the agency's servers to an external host, and here is where I've been involved. I'm a web developer with some background in networking and system administration, and I understand the difference between VPSes, dedicated servers and managed/unmanaged, but I never went through the burden of choosing a (semi-)dedicated host for a site you live off. I don't necessarily have to give him a (or more than one) name, but at least I need to give him a range of prices and kinds of services; things like maintenance of the web site or the machine are not part of the discussion at the moment. My \"analysis\" should include a \"minimum\" and a \"maximum\" price, and the services these prices offer, both for VPS and dedicated server solutions. I've been searching around for a while and found some hosting companies; I also tried to look around for reviews and comments about them, but it's a daunting task, and I got more confused than when I started. I've found everything and its opposite; whom should I trust? If it matters, his e-commerce site is spec'ed like this: * Classic ASP * SQL Server 2000 * A couple of IIS and ASP plug-ins * Around 250 MiB of space * Runs on a 6 MiB line with ~30 other sites * I do not know his average daily traffic, but definitely it's not Yahoo or Facebook ## Short version I'm here, fellow webmasters, to hear your opinion: * Are there on the net some reliable hosting company comparison sites?
What I found so far looks quite amateurish, and doesn't provide much info besides a ranking. * Are there some reliable sites where I can find reliable reviews of the hosting companies? What I found so far is forums filled with flaming users of varying technical skill, and their signal-to-noise ratio is awful. * How can I tell a good hosting company from a bad one? What should I look for? Things like \"unlimited\" and \"unmetered\" are the basics; what's the next step? * Remember, this question is not about which hosting company is \"right\", but how I can find a reliable one (I'll evaluate which are \"right\" afterwards, when comparing their offers). P.S. I deliberately omitted the links to the sites I found to avoid bias; if they should be included I'll gladly add them."} {"_id": "11112", "title": "Java hosting service provider?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I want to host a Java web application. How do I decide which hosting service provider to choose? Is there any source where I can get such info? Most importantly, being from India, should I use a provider whose servers are located in the US or one whose servers are located in India itself? Thanks in advance for any suggestion/reply to this query."} {"_id": "19585", "title": "Hosting Rails sites; VPS or shared, and how much RAM?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have 3 Rails sites to launch, all of which are fairly small, consisting of a custom CMS, one with an online store, and 2 Sinatra sites which are mainly static portfolio sites. What would be the best way to host these sites? (I've deployed on Dreamhost shared hosting before and some VPSes.) Is it best to manage them together under one VPS? e.g. Linode at $20/m (for the cheapest option, 512 MB - would that even be enough RAM?)
or keep each Rails site separate and host each one on a small VPS? e.g. $4/m (there are often lots of deals like this on webhostingtalk) I'm currently hosting the Sinatra sites for free on Heroku but finding it a bit slow sometimes."} {"_id": "31022", "title": "$5 File Hosting with API", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Could anyone suggest a reliable file hosting service for personal use with a developer API that will not cost more than five dollars? Thank you."} {"_id": "25648", "title": "What's the most reliable FREE web hosting for WordPress blogging?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'd like to hear the opinion of those who have been using such a service for at least half a year or more. I have my own registered .com.ar domain. It would be great if it were ad-free also. Thanks in advance for sharing your experience."} {"_id": "54962", "title": "What to look for in a free hosting plan?", "text": "I have a test website that's hosted on a free plan by Zymic. As I'm typing this, it and my site are down. I don't want to let my clients down in the future. It's been down for over 2 days. I thought it was a coding problem at first, and then found out I couldn't connect to my server. Zymic had very good reviews, and its downtime was OK (not high or low), but now I want to change my web host. What should I look for (besides a downtime guarantee)? Also, do you have any suggestions for hosts with all the benefits? Any feedback will be greatly appreciated."} {"_id": "13364", "title": "What hoster offers automatic WordPress upgrades (like Dreamhost) and _also_ spreads your sites over a range of unique IP addresses?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I just love the one-click install and automatic WordPress upgrades at Dreamhost.
However, I am looking for a hoster where I can be sure that my sites end up with different IP addresses (for SEO reasons). Dreamhost does not offer that option; you have to buy unique IP addresses for every site if you want to do so, which is expensive. I do not care that it is shared hosting, meaning that my sites share the same IP address with other sites, as long as they're not on the same IP themselves (and yes, preferably not within the same IP range). Do you know of a hoster who offers the WordPress managing comfort of Dreamhost yet lets you spread your sites over multiple IPs? (Note: inspired by a similar discussion at this Dreamhost forum: http://discussion.dreamhost.com/thread-129719.html)"} {"_id": "56341", "title": "ColdFusion hosting in New Zealand", "text": "I am looking for an NZ-based hosting company. Can anyone advise a good hoster with ColdFusion support and SSL, based in New Zealand? Thanks!"} {"_id": "59602", "title": "Recommendations for small hosting solution with lots of support", "text": "I maintain a personal website though I have never put much effort into building up the front-end. I really just use it as a testbed for various projects I'm working on or tinkering with. For a few years I've maintained an account with Hostgator but they've gotten to the point where I'm rather fed up. Anyone have recommendations on the best alternative? I don't need a ton of space, or bandwidth, but I do need full SSH access, FTP access, PHP, Perl, MySQL, etc. - the basics - and the ability to modify or change the PHP installation and install various packages (Ruby and Python are an extra plus), and I don't want to pay an arm and a leg. Doesn't exist... yeah, I know."} {"_id": "6425", "title": "Are there any Shared Web Hosts that provide access to run Windows Services?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
I'd like to be able to have a very lightweight Windows Service call some code on the Web Server at regular intervals. Do I have any options besides a Dedicated/Semi-Dedicated Server?"} {"_id": "6203", "title": "Free ASP.NET & MS Access support webhosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a free webhosting plan to test my website before buying one. Support for: ASP.NET 3.5 & MS Access 2003. Thanks."} {"_id": "5643", "title": "Free webhosting vs. paid webhosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I would like to know the pros and cons of free webhosting vs. paid webhosting. What are the factors that should make us move on to paid web hosting?"} {"_id": "4908", "title": "linode.com/slicehost.com/vps.net what to choose?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking for a new VPS for http://hotelpublisher.com. At the moment it is either linode.com, slicehost.com or vps.net (alternatives are welcome). Since I already use Google cloud to deliver data, my priority is RAM/CPU/reliability/price. Can anyone advise which of the VPS providers is the best in their opinion and why?"} {"_id": "23668", "title": "Looking for web hosting or dedicated server services, not sure", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am building a website for a church; this website uses MySQL and PHP for a list of members, saving events, etc. The problem is that I don't have much experience publishing sites. I need web space to publish this site, and this web space has to be PHP- and MySQL-compatible. Please help; any info will be much appreciated. I'm looking to spend no more than $30.00 a month. Thanks."} {"_id": "39357", "title": "Github download is deprecated", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
I'm using GitHub for an open source project; we have also used their file system so that users can download our binary installers and not just the source code. But downloads are now deprecated: https://github.com/blog/1302-goodbye-uploads What's the best free service for hosting downloads? Analytics features are a plus but not a requirement. **Update:** Someone closed this question even though the linked question has nothing to do with GitHub. Anyway, the answer is that GitHub now supports a feature called Releases. https://github.com/blog/1547-release-your-software"} {"_id": "13367", "title": "Is there any free Java (Grails/Tomcat) hosting server?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I want to host my Grails application on the net for testing... are there any free server sites that support Grails and MySQL?"} {"_id": "5641", "title": "The best web hosting option to start off on", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm extremely new to hosting, and I've just finished developing a site for my friend. If you were to select a web host with great support and an affordable price, which one would it be?"} {"_id": "28381", "title": "Website Hosting/Registration", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am planning to launch a website soon. I wanted to know what solutions are available for hosting and registration. * Starting with domain registration: any site you have used/preferred? I am considering either GoDaddy or 123reg. Does it even make any difference which you choose? Is there any fine print I need to worry about? I am based in the UK, not sure if that helps in resolving any issues if encountered. * Does my hosting need to be done at the site where I purchased my registration? If not, will there be any transfer fees if I change my hosting? Can I just register the name now and worry about hosting later?
* At the moment, I plan to have it up and running using either some sort of a tool or a template, and perhaps add the bells and whistles down the line. I understand 123reg has its own builder tool available. There are a few suggested solutions like WordPress, Drupal & Joomla... I am a C++ developer, not a web programmer, but I do feel the need to open the hood up and make changes if I see fit. So I guess I am looking for a solution where I can easily drag-drop the widgets I need and, when the time comes, customize it. Which CMS would you recommend? * Extras: What extras do I need to get? I was advised to get hold of WHOIS privacy to keep the spambots away; anything else you guys would recommend I keep my eyes open for before I sign on the dotted line?"} {"_id": "34591", "title": "Hosting advice for a write-heavy dynamic website", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have built a website using PHP and MySQL and now I am looking for a hosting service. I am expecting about 1000 users registering and about 5-10k pageviews/day in a week's time. So which host should I opt for? The site will let users submit content as blobs and upload around 10 pictures per user. I hope that traffic will increase, so can JustHost's or Bluehost's shared hosting serve that purpose, or should I go for a more dedicated one? Basically the site is write-heavy, there are on average 2-3 MySQL queries per page, and it is quite dynamic. So, given these requirements, which web host would be optimal for me?"} {"_id": "27728", "title": "Dedicated Servers: Is one better than two for a LAMP pseudo-HA setup?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I know there are zillions of comments about hosting out there, but I haven't read much about this. Our current well-known host is having too many problems, the hardware we are on is subpar, and I'm ready to leave.
A day of downtime can cost as much as our monthly hosting bill. A month of bad performance is just killing us right now, user- and Google-wise. I'm wondering about running two dedicated boxes for LAMP, one running as the primary Nginx/Apache (proxy pass), and the other as the MySQL box. Running a single box scares the bejesus out of me because who knows how long it will take anyone to fix a RAID card or whatever. The idea is to set this up using some sort of failover system based on Pacemaker and Heartbeat. If one server goes down, the other can take over, running both web and DB. There are some good articles over at Linode about this. I have a few DBs that are 1 GB+ and would like to load them into memory. Because of this, I'm shying away from a Linode HA setup, because for the price I could do it with two dedicated boxes like I described. Am I mad or an idiot? What are people out there doing for pseudo-high-availability, good-performance setups under $400/month? I'm a webmaster; I do a lot of things, none of it that well :)"} {"_id": "5098", "title": "Need SQL Server Hosting 50GB or More", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking for a hosting solution (dedicated or shared) which will allow me to host a SQL Server database service (not SQL Express but the Web edition). The size of my database might grow to 50 GB or more. The web application will perform more reads than writes. I also need daily backups and RAID 1 storage. Is there a reliable and economical hosting company that would provide this?
Additional question: if there is an easy way to host MS SQL on Amazon's EC2 service, that would be preferable."} {"_id": "59974", "title": "Is there a way to compare the speed of potential webhosts?", "text": "I want to get VPS hosting, but don't know what to choose by; in particular, how do I know if it's speedy enough?"} {"_id": "63513", "title": "Visual Studio 2012 C# web application to HostGator?", "text": "I have a web application I developed in Visual Studio and I am looking to get it hosted on a domain I registered. The server side of the application is in C#. Is there a way I can get this application hosted on HostGator? How would I go about doing that? If not, what options do I have?"} {"_id": "67525", "title": "Best Free Webhosting with MySQL & outside access?", "text": "I am looking for some free webhosting to test some programs on. I need the following: * Ability to point my DNS to it * PHP & MySQL * Allows outside connections (i.e., I can access MySQL from my own computer using Java or Python) * The more bandwidth the better! * FTP support Thanks for your help."} {"_id": "7712", "title": "Help selecting dedicated server with good disk I/O & network", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking for a cheap dedicated server. (I was earlier happy with a VPS, until I realized that the disk I/O is not at all reliable and depends on what your neighbours are up to at the moment.) I was browsing through http://www.lowenddedi.net/the-database I don't understand the memory speed and NIC speed columns at all. What is their effect? Do I need to worry about them?
Also, can someone help suggest a provider with the following criteria: 1) Good & reliable network 2) Price <= $60/month."} {"_id": "54455", "title": "The absolutely cheapest way to host my tiny website?", "text": "What I need: a domain name, 10 MB of disk space, 1 GB bandwidth, PHP+MySQL. Is there any way I can get this all for under $1 a month?"} {"_id": "63115", "title": "What Webserver Do I Need For A Social App", "text": "I am making a social application whose basic workings are like Twitter's. When a user opens the app, it fetches data (texts and images) from the server, list items are filled, users can be followed, etc. I think the number of users could be up to 500,000, and I want to use `node.js` for the server side because it's faster than PHP. How should I find the best `webserver (VPS, dedicated, ...)` for this application?"} {"_id": "9427", "title": "HTML/CSS website hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I would like to host my HTML/CSS website (actually, just one page, but it may grow). Since I don't need PHP, MySQL or such things, I don't want to spend much. Is there anything out there? Thank you."} {"_id": "6339", "title": "Best .NET 4.0 web hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? After dealing with a problematic web hosting company, I have to move to a new one. Which do you think is the best for .NET? TY"} {"_id": "3504", "title": "Looking for new hosting after Dreamhost", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for new personal hosting after using Dreamhost for a while. I'm looking for hosting which has the following features: * Shell access (obviously) * Prebuilt LAMP stack (optional) * Python/Ruby support * Scalable * Good speeds around the world (I travel a lot) * Not too expensive?
What are you currently using for hosting, and why do you like it?"} {"_id": "14995", "title": "How do I choose a web host with support for the mysqli extension in PHP?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have designed a website running on PHP and MySQL, and I used the mysqli extension instead of the normal mysql extension in PHP, so all my functions now use mysqli. After developing a major part of the website, I started looking for suitable webhosts, and found that there are few that offer support for the mysqli extension. I'm not sure if I really made a mistake by preferring mysqli instead of mysql. Could you please let me know - a) whether mysqli support has to be confirmed with the host before I sign up for their web space account b) whether mysqli support will be given by default in all PHP 5.2+ versions c) any good (paid, but not too expensive) webhosts with reliable service offering support for the mysqli extension (in shared server mode, not dedicated server) NOTE - mine is a small business, and I'm looking for a shared-mode web host for now. Thanks!"} {"_id": "19851", "title": "Looking for a CDN", "text": "Most of the CDNs that I've seen require you to upload your content in advance. I'm looking for a CDN that, upon receiving a request for a resource it hasn't seen, will contact my application server. If the application server returns something, it should be sent to the user and then cached in the CDN. If not, it should just return a 404. If the user requests an unexpired item, the CDN should just serve it without bothering my app server. Does anything like this exist? Is there a way to get CloudFront to work like this?"} {"_id": "30709", "title": "One domain multiple web hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have a website that only supports PHP. What I'm thinking is that I want to purchase another web host that supports Ruby on Rails and Python.
Is this possible?"} {"_id": "18297", "title": "Hosts supporting SSH with price lower than 5 USD", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking for some hosts that support SSH with a price lower than 5 USD. Thanks"} {"_id": "59229", "title": "Recommendations for safe and good hosting sites for a personal website?", "text": "Can someone provide some recommendations for safe and good hosting sites for a personal website? What is this GitHub that I hear about? Can I host my basic website on GitHub?"} {"_id": "59732", "title": "Could you advise me: do I have to pick a VPS?", "text": "My plan: 1) Create Website 1 on Main Domain = www.example.com which is based on the Joomla CMS 2) Create Website 2 consisting of OpenAtrium (Drupal) on Sub-Domain = start.example.com 3) Create Website 3 consisting of Moodle (Open Source Learning Platform) on Sub-Domain 2 = learning.example.com I was looking for a cheap UK-based server with a 512 MB PHP limit."} {"_id": "29935", "title": "Hosting a small e-commerce website from home. Recommended?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I live in the UK and I currently use JustHost to host several domains. I think they are very reasonable, but recently one of the domains came under a massive spam attack which flooded the MySQL database, and because the activity it caused went against their Terms of Service, they subsequently suspended my account. Admittedly, it is not their fault, but we have struggled to contact them to get the account reactivated due to the difference in time zones, as they are based in the USA. Because I host websites for other people and I am going to be launching an e-commerce website soon, my dad is quite concerned now and is suggesting we move to a better solution.
I'm hoping that such an incident wouldn't happen again, but because I'm running a business I can't really take that risk - I need to eliminate as many possibilities of the sites going down as possible. AND if there is an issue at any time, we can't have different time zones preventing it from being resolved as soon as possible. We already have a server and a computer that are running 24/7 anyway, and we use Virgin Media with a 60Mbps connection, so my dad thinks we could host the sites ourselves, especially if we buy a UPS. I must admit that the idea is attractive, but I'm just worried about the load because I have no idea how many people will be using the websites. At the minute there aren't too many visitors, but with the promotion I'm going to do, I'm hoping the numbers will be in the thousands at least. I'm no expert at web performance or anything, so please forgive me. However, I can't see there being much more than 15 GB of files any time soon, and there is very little anyone can download. The main footprint is the large MySQL databases that the websites rely heavily on. So, can anyone give me their advice/opinions on what you think I should do? If you have any experience with this sort of thing I'd be interested to know also. We have already decided that at some point we would need to move to, at least, a dedicated server, but I'm not sure what to do now. Obviously, I don't know much about actually running a web server, but I think I am pretty literate and I could easily learn - to me it is just a case of the logistics of everything etc. Thanks in advance"} {"_id": "44720", "title": "Hosting solution for digital content only", "text": "I have a client that wants to provide free downloads of their digital content, but at this time they want a free solution for the startup. I have seen Amazon's S3 service, but it's paid. I am unaware of any other solutions for securely hosting content that can be linked to a website for download.
They want the download to be called from an external source but still processed from their site, if that helps. I told them SourceForge may be a solution, but they didn't want their viewers leaving the main site."} {"_id": "12421", "title": "What is the best free hosting provider for my site?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have a small \"hobby\" site that generates almost no income for me (but still is fun to run, and that's why I keep it around). The site gets about 75 hits a day, though I could easily see that increasing to 200 or more in the future. I want to move it to a free hosting provider. The hosting provider should support the latest versions of PHP and MySQL, should be able to run WordPress installations, and should have PDO and sendmail enabled. Does anyone have any recommendations? Thanks :)"} {"_id": "8365", "title": "Could you suggest me the cheapest and most \"basic\" hosting/server solution for my website?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? My website (a soccer portal, running Joomla and SMF) has grown so much in the last months that the shared-hosting service I bought about three months ago has decided to rescind our contract and pay back the remaining nine months (since the contract is for one year). I already searched on the net, and on Stack Exchange too, of course, but I didn't find a solution to this problem. The website's load is about 280'000-300'000 hits/month. That's about 10'000 hits/day. This is the main reason they rescinded the contract: we exceeded the number of processes allowed; no other reason exists. On the web, I read that they all say \"if you use a lot of resources, you need a dedicated/virtual server\" (something like a VPS). Ok, so I've searched for a VPS, but I've found they are all too expensive! The cheapest I've found (on this Q&A board) is Linode, which is about $20/month.
That could be a very good solution, but I still have some doubts. My question is: given that some friends of mine are system analysts, so I wouldn't spend any money administering the server, can you suggest the cheapest and most basic hosting/server solution (I mean VPS, dedicated hosting, shared hosting, cloud...) for my problem? As I've already said, our website uses two technologies, Joomla and SMF, so we need the most basic solution. We don't offer emails to our users, we don't need subdomains, and we don't need special technologies such as Zope, Tomcat and so on. The most basic: PHP/MySQL. And, _if available_, cPanel, just for ease. Linode is the best I've found, but do other solutions - according to my parameters - exist? Thank you very much."} {"_id": "9061", "title": "WebHosting solutions allowing self-made Apache modules", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am searching for a web hosting solution that would allow me to use my own Apache modules written in C. Do you have any idea who would offer that? I have duplicated this question from ServerFault.com; maybe it is more appropriate here."} {"_id": "5441", "title": "Free java hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Does anyone know the URL of a free Java host? Thanks!"} {"_id": "9010", "title": "Recommend hosting with fast MySQL database please", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am frustrated to no end with my current hosting provider, Media Temple. Yes, they are flashy, and have some decent degree of flexibility with their GS plan, which I have. But any time I install a site that needs a database, it is slow. Like, really slow - taking anywhere from 10-15 seconds just to load a page. I would host in house, but there are a lot of complications that come with a LAMP server that I don't want to deal with.
Honestly, I'd rather spend the time developing. What can you recommend?"} {"_id": "25112", "title": "Podcast bandwidth", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? **The situation** Hi folks, I host a medium-sized non-commercial audio-only podcast available through the standard ways (website streaming, iTunes, Stitcher, Zune, etc.). All these ways of listening are hotlinked back to a simple FTP server running on a VPS. We've recently been asked to leave that VPS because we're averaging 50GB per day of traffic. **The problem** I'm looking for a new method of hosting these media files. Transfer speed doesn't matter much, so long as people don't get refused a connection, but overall bandwidth per month does. Also, all the actual site hosting, RSS, and other podcasting stuff is already handled and we're not looking to change it. Literally all we need is a new place to store and distribute our media files with hotlinks. **The solution** So far, the best solutions I've found are an order of magnitude more expensive than what we have now. All podcast CDNs seem to charge by the GB of transfer, or GB on disk, or both. All the regular web hosts I've found turn me away when they hear that 50GB-per-day number. I've considered getting several VPSes and trying to balance the load, but that would be an awkward solution, and a badly coded one as well if I had to do it myself. Is there anything I'm missing? I'm willing to look into anything."} {"_id": "53588", "title": "Price per hour or per month", "text": "I am trying to select a host for my Node.js applications. They are low-traffic applications - I expect an average of 1-2 hours of use per day, so I am looking for a package where I can pay per use only. But I am confused by the different pricing schemes of price per month vs. price per hour - I am not sure which price takes effect when.
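To make my confusion concrete, here is my reading of the capped hourly scheme as a sketch (the $5/month plan and its hourly rate are hypothetical numbers I made up; 672 hours is the cap-over point the FAQ mentions):

```python
# My reading of capped hourly billing (hypothetical $5/month plan;
# 672 hours is where hourly billing crosses over to the monthly price).
def monthly_bill(hours_powered_on, hourly_rate=5.0 / 672, monthly_cap=5.0):
    """Bill per hour the server is powered on, capped at the monthly price."""
    return min(hours_powered_on * hourly_rate, monthly_cap)

# A server left powered on all month (~720 h) hits the cap,
# even if visitors only use it 1 hour per day:
always_on = monthly_bill(720)     # capped at the monthly price
only_30_hours = monthly_bill(30)  # billed hourly, but only if I power it off otherwise
```

If this reading is right, what I pay hinges on powered-on hours, not visitor hours - I would pay the full monthly price unless I actually shut the server down between uses.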
For example, the DigitalOcean FAQ says: _\"If you use your server for less than 672 hours during the month, you will be billed for each hour that you used it. If you use your server for more than 672 hours that month, you will be billed at the monthly cost\". But on the other hand, in the next question they say: \"Am I charged while my Droplet is in a powered-off state? - Yes. Your diskspace, CPU, RAM, and IP address are all reserved while your Droplet is powered off\"_. If I understand this correctly, this means that, even when no one visits my website, they charge me as if I do use the site. As another example, AtlanticNet says: _\"With per-second billing, you only pay for what you use\" ... \"Monthly pricing based on average use.\"_. If I understand this correctly, this means that the only effective price is the price per hour, and I pay only for actual use. So, my question is: if my Node.js application is always on and waiting for users, but users use it only 1 hour per day, how much will I pay? Will I pay for an entire month, or only for 30 hours?"} {"_id": "7200", "title": "CodeIgniter Hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Where can I find affordable web hosting for a CodeIgniter web application? I need hosting that supports all the PHP extensions that CodeIgniter requires. Can anyone recommend a dependable company?"} {"_id": "6815", "title": "Things to consider while choosing a web hosting service?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Also, does the country matter?"} {"_id": "12391", "title": "Hosting provider for both ASP.NET MVC 3 and MongoDB?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Are there any hosting providers out there where you can host an ASP.NET MVC 3 Web application that uses MongoDB?
I know about MongoHQ, but I'd really prefer to host the code + the DB at the same provider."} {"_id": "12576", "title": "Best and Economic Domain Purchase and hosting for asp.net app", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Hi guys, I have created an ASP.NET application and now I want to publish it on the internet and place ads like Google AdSense on it. I don't have a domain name. So please suggest a hosting service which is technically good (provides an easy-to-manage control panel for your site, etc.) and cheap."} {"_id": "12889", "title": "Hosting suggestions instead of Google App Engine", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? The problem is the following: I made an EJB+JPA+GWT webpage which uses GWT for the visual layer, EJBs for the service layer and JPA for interacting with the database layer. I wanted to host my website on Google App Engine (GAE), but then I learned that GAE doesn't support EJBs, so I was wondering if anyone knows of a cheap solution for hosting a web page that uses GWT+EJB+JPA."} {"_id": "19416", "title": "What's the cheapest non-shared webhost out there?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? It has to be Linux hosting that lets me set up whatever I want (Node, Mongo, etc.)"} {"_id": "7815", "title": "Which is the best ASP.NET Web Hosting in France?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Can you give me a list of the best ASP.NET web hosting companies situated in France? Thanks in advance."} {"_id": "25675", "title": "Public website to host a yellow-page like project", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am working on a community project which is a website serving as a yellow page of shared artworks among students.
At first, I planned to build the website myself (say, with Drupal), but I'm worried about the maintenance effort, bandwidth, security, scalability, etc. So I am looking for a public service (e.g. tumblr or wordpress.com) ... The project: * each student posts the artwork (one preview image and one attachment) somewhere on the net * then the student logs in to the yellow-page website to: * write a title and description of the artwork * upload the preview image * write the link to download the artwork * any student can download the artwork; the website records the download count * any student can vote a 5-star rating; the website records the rating count and calculates the average * any student can view artworks via the web, sorted by newest, rating or download count Is there an existing website which can provide such a service?"} {"_id": "30523", "title": "Where can I get a cheap database, no web hosting needed", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm building an application which requires a fairly small online MySQL database. I don't need any web hosting. What are some cheap options for an online database? * ** _Edit (a bit more about what I'll be using it for)_** _The database itself is very small; it contains market statistics for 5 weeks of time. Once a week the data will be updated, so that it always contains the most recent 5 weeks._ _Then I will use that data to create an XML file which is generated with PHP. The XML file will need to be accessed hundreds to thousands of times per month._"} {"_id": "59231", "title": "Searching a FREE WEB HOST which has a PHP MEMORY LIMIT over 196MB?", "text": "I tried to install OpenAtrium on neq3hosting, 2freehosting and hostinger. They all have a 128MB PHP memory limit, which cannot be increased through .htaccess, settings.php, or similar methods (I also cannot access php.ini).
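For reference, this is the kind of override I tried, which did not take effect on any of those hosts (196M is just the value I need; the directive is only honoured where the host allows per-directory PHP overrides):

```apacheconf
# .htaccess - attempt to raise the PHP memory limit
# (only works when PHP runs as an Apache module and the host permits php_value)
php_value memory_limit 196M
```

The settings.php route was the equivalent `ini_set('memory_limit', '196M');` near the top of the settings file, which those hosts also ignored.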
Do you know a free web host with a PHP memory limit of 196MB or more, to install OpenAtrium, Drupal (and other modules)?"} {"_id": "55997", "title": "which vps hosting providers support custom images?", "text": "Which VPS/cloud hosting providers support custom machine images (like those offered by Bitnami and TurnKey Linux)? I know Amazon does, and unfortunately DigitalOcean doesn't. Which other large cloud providers support custom images?"} {"_id": "6155", "title": "Joomla hosting with PHP 5.3 +", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Need a good Joomla host with PHP 5.3+ required. Any good suggestions?"} {"_id": "55792", "title": "Configuration suggestions for a server hosting a general purpose website", "text": "I hope I am asking this question on the correct Stack Exchange website. I have a general-purpose ASP.NET website, like the stackexchange.com Q&A sites. It is a new website, and I don't know what amount of resources it will use. We can't afford a high-end server. Currently we have the following server configuration: * VPS with Windows Server * Single Core Processor * 1GB RAM * 70GB HDD * 700 GB BW What I want to know is: is this configuration good for my type of website? I mean for 1 year at least, since we don't have much budget currently. If not, what would you suggest?"} {"_id": "53829", "title": "Are there any webhosts that do not require an IP address for direct database access?", "text": "I am trying to create a little program with Java/JDBC and MySQL that updates a database on my website periodically. However, this is for work and I am having problems remotely connecting via JDBC to the MySQL database. I am using Bluehost right now, and they require me to add my IP address to give it permission to access the database. Is there a webhost that doesn't require this?
Thanks"} {"_id": "45334", "title": "SSL Hosting Support for site with existing SSL certificate", "text": "I would like to know if there are any hosts that host PHP sites with SSL support for free, given that I have an existing SSL certificate for my domain. I checked out many questions here (about 10) but none of them have my specific requirement. All the questions are about whether there are hosting sites that provide free SSL; however, most sites do not, since they have to pay for the certificate. Here, I have a valid certificate. Heroku offers this custom certificate addon for $20, but I want to know if there are sites that offer it for free. https://addons.heroku.com/ssl [Please do not close this question since the requirement is different]"} {"_id": "54020", "title": "Are there any free web hosts that allow adult content?", "text": "I'm looking for a free web hosting (PHP) provider that permits adult content. And by adult content, I don't mean videos. At most, photos. And I can keep it non-nude. I just want to display information about pr0n-stars. If I keep it non-nude, but still include information about pr0n-stars, do you think 000webhost will allow this? Any advice would be appreciated. Thanks!"} {"_id": "29823", "title": "Domain registrar that will allow separate sub domain NS record", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? So I recently found out to my dismay that my domain name registrar does not even support subdomains?! Hence I need to get a new one quickly. I am faced with a problem whereby my company has just signed up to a (legit) bulk email provider. I want to: * have a domain abc.com which hosts a website and also has MX records set up to route email * set up a subdomain news.abc.com and re-point the whole name server for this subdomain to the bulk email provider. A couple of the well-known domain registrars do not provide this service.
Does anyone have any recommendations for flexible domain registrars? Note I am not a sysadmin, more a web dev, so I would prefer not to have to manually edit zone files! Thanks"} {"_id": "5001", "title": "Looking for a reliable and inexpensive dedicated server host", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking to dump GoDaddy, the company I was with for the last 5 years or so. Yesterday my dedicated server went down for at least 5 hours. I was not impressed with the support I received, and at the end they informed me that my hard drive is probably about to die and that I needed to move. They want me to do it on my own or charge me for their support, even though, as I pointed out to them, they sold me a lemon and I am only halfway through my lease term. Instead I am looking for a new provider that will send fewer promotional emails and provide better support. My current needs are not that great, but I do need a dedicated server. Any suggestions?"} {"_id": "22023", "title": "How to compare Shared versus VPS hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? While shopping around for a new hosting service, I have found that I have no idea how to decide between shared hosting (which I presently use for _all_ my sites) or virtual (VPS) hosting, which is always much more expensive. The real question is **How to determine when shared hosting is no longer an option for a site?** PS: This question covers some similar ground but is too specific for my needs."} {"_id": "10236", "title": "Good .NET4 hosting providers", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Are there any good .NET 4 hosting providers that could host an ASP.NET MVC application on .NET 4.0? IXWebhosting, which is what I'm using now, says that they have no plans to move to 4.0.
Either virtualized or dedicated hosting, depending on the price?"} {"_id": "18497", "title": "Where can i find free webhosting with pdo mysql and curl enabled?", "text": "> **Possible Duplicate:** > What is the best free hosting provider for my site? I am looking for a free web host for an existing .com domain with **PDO, MySQL and cURL** enabled"} {"_id": "9541", "title": "Question about MochaHost.com Hosting Plans", "text": "This is not advertising; I've just found this website (MochaHost) that offers great things for just $3/month, like: * 2 LifeTime FREE Domains * UNLIMITED Space and bandwidth * SVN (subversion) support * SSH access * PHP 5, Perl, Python, and Rails I'd like to know if any of you have taken a hosting plan from them; what do you think about it?"} {"_id": "3524", "title": "ASP.NET, PHP, SQL Server, MySQL hosting provider for developer?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking to find a web hosting provider that provides both ASP.NET and PHP hosting. The purpose is pretty much just a programming playground for me to develop in and possibly show some of my work. As I use ASP.NET and PHP, I am looking for a provider that supports both of these technologies as well as access to MS SQL Server and MySQL. Of course I am on a budget, so I really can't afford to pay more than $20/month. I had looked at M6.net; however, in scanning for reviews I found a good deal of negative feedback. I currently use DownTownHost.com and have a good experience with them, but they do not support ASP.NET. I do not require email hosting, though I know most packages include it anyway. Thanks for any suggestions.
Some features that are important to me: * ASP.NET 4 hosting * ASP.NET MVC support * PHP * MySQL * SQL Server * URL Rewrite support * multiple sites under one account"} {"_id": "27735", "title": "Shared Hosting Provider", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I've been with Dreamhost for 5 years, but the amount of downtime I have experienced over the last 6 months has been outrageous. As of now (2012), which hosting provider would you recommend? Most of my sites are small-to-medium readership blogs running WordPress. I've been looking at InMotion and HostGator. Reliability is paramount. Thanks"} {"_id": "17848", "title": "a good VPS to grow into (cheap at start, multiple sites, php and python, easily custom-configurable)", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We develop and host for many clients. I am choosing an upgrade over shared hosts, preferably a VPS. The new site should be able to deploy PHP for multiple sites. It should also support a localhost Python web process (no Django/frameworks involved) (I described the details in http://stackoverflow.com/questions/6939561/most-easy-reliable-cheap-way-to-deploy-this-python-workhorse-app-with-a-php-f ). We expect ~1000 hits a day combined (with very modest spikes expected) and consequent bandwidth use of 20GB per month. So, I ask for: * latest versions of PHP and Python on a popular distro, accessible via shell * fair pricing at this traffic level (~$10 per month) * fair pricing for scaling up and good upgrade/scale options * GUI config for multiple sites/server features * sync via SSH even over an obscure network architecture (I am told we have no \"public IPs\") * * * As a hypothetical question, which provider could help me set up a Git/Mercurial+Trac server the most easily?
(I know there are things like Bitbucket, but this seems a hypothetical challenge that correlates well.) I heard good things about WebFaction and Pyrox. How will they suit my above requirements? I don't want to steer the conversation prematurely. * * * I will start off with WebFaction or VPSlink, which have been doing this for years; the economics are firmly on that side, and it is also easier for a newb. But Linode is what I'd choose as a great upgrade for a serious site."} {"_id": "67933", "title": "Hosting for image heavy wordpress website?", "text": "What type of hosting should I choose for an image-heavy WordPress website? I have to host about 30000 images of approx 1.5 MB each, plus the thumbnail that will be generated for each image, i.e. about 80-100 KB each for a 360x200 thumbnail. What should I use? VPS, dedicated, CDN, etc.? Also, which hosting company do you recommend?"} {"_id": "56975", "title": "Which service for storage and public hosting of image files", "text": "I have been searching this for hours, and every service I found has limitations or costs that do not work for me. My quest is simple. I have hundreds of thousands of image files (time-lapse JPG images < 5 MB) which are organised in folders and subfolders. I originally hosted them with HostGator. Unfortunately, they see them as storage only, because no file was displayed on a web page for 2 months. So HG is kicking me out. I need to quickly find an alternative service which works like a web server, so I can organise the files exactly the same way, upload them via FTP or a PHP script (and rsync from HG to move all the files over initially), AND be able to display the image files in a normal image tag in a web page. My local ISP in RSA is too expensive and also limits the hosting space to very little. You know the story for HostGator. SmugMug has unlimited storage but only allows upload via their web pages or Lightroom. Would cloud storage be a good option?
Can someone recommend a good and affordable service for this?"} {"_id": "47363", "title": "Is there a site that can host adsense and not blocked in China?", "text": "My AdSense account is \"hosted\". I want to host AdSense on my own site because Blogspot is banned in China. So which eligible host 1. shares 100% of the revenue and 2. can be seen in China? I know YouTube and Blogspot are a big no-no. So what else?"} {"_id": "22141", "title": "What's a good host to backup my personal data?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a host that will allow backups on their servers for personal files such as graphic design work, completed video projects, large files, etc. Is there a good host out there that allows this?"} {"_id": "19630", "title": "looking for cheap, USA-based, shared, ASP.NET 4 hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Actually, I did find a solid company in Poland, where I live, but the problem is I will need a USA-based server IP, because Google gives an SEO bonus based on server location and I expect most of the visitors to come from the USA. The webpage in question will be tiny and niche, so I don't want to overpay. Also, is there any reliable way of making sure I will get a USA-based host? For instance, I just looked at godaddy.com and they say their servers are in Europe - probably they think that's what I want :). Regarding IP location's influence on Google rank, my sources: http://www.youtube.com/watch?v=hXt23AXlJJU http://www.youtube.com/watch?v=keIzr3eWK8I"} {"_id": "14251", "title": "Setting up a personal domain name", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking to set up a personal website, but I know very little about web hosting. Could somebody recommend a (not very expensive) host? What should I look for when choosing a host?
Also, I'm rather icky about atriyasen.com, because people can't make out if I'm Atriya Sen (which I am) or Atri Yasen! Would you recommend atriya-sen.com? atriya_sen.com? Finally, what about other TLDs like .name?"} {"_id": "38316", "title": "Reputable web host in mainland China?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We currently have a rather poorly set up Windows 2003 box based in Shanghai, with little to no support and no control panel/mail server. I am told that for legal/business reasons the website's host must be based in the same location as the company, but this could well be misinformation. Are there any well-known, quality hosts in China that offer reliable English-speaking support? We did consider GoDaddy on the west coast of America, but were informed of the risk of the site being shut down without any notice. We don't have any technically-minded contacts out there to advise us, and we're hoping that someone will have more experience in this department. Thank you."} {"_id": "47948", "title": "Windows hosting - Shared | VPS | Cloud", "text": "I'll begin by stating what I require for a new website project: * ASP.NET (probably 4+) * SQL Server (not fussy, probably 2008+) * IIS 7 * SSL Not a huge list, which is nice. However, the behaviour of the site and its traffic will likely impact things. We are expecting to see quite large peaks and troughs, some spontaneous (e.g. a new ad), but generally predictable increases in traffic around Xmas (large spike on Boxing Day) and Easter. I've previously only ever used shared hosting (freelance) or dedicated (at a company), so I have never used VPS or cloud. I have ruled out dedicated due to cost. Shared I am not so sure about; I have looked at shared hosting on Azure but wasn't entirely sure between Shared and Reserved. Further, how do these differ from cloud services?
From some articles on the web, it seems that cloud services mean that you definitely get the resources you are paying for, rather than possibly sharing them with other customers on a shared service: > Since each customer gets his or her own virtual server \"instance\", there is > almost no contention for resources since the hypervisor manages these > partitions. http://www.smallnetbuilder.com/cloud/cloud-services-apps/395-cloud-vs-shared-hosting-whats-the-difference?start=1 Additionally, because there is not just one server, if more resources are required then there is the ability to quickly boost numbers and later reduce them when traffic spikes are over. Lastly there is VPS, which seems to be similar to shared hosting in terms of hardware (a single server rack) but with the appearance of a dedicated server. However, this introduces maintenance requirements, though, certainly with regard to SSL, it might be the only way to get what I want (currently no SSL on shared sites; not sure about cloud). Hopefully I've got my understanding of the technologies correct. Can anyone advise which solution would be best for me? Money is obviously tricky as it depends on the host, but our starting fee would ideally be no more than \u00a350/month, and certainly to begin with we only need the most simple of hardware requirements. Oh, also we're solely UK-based, which possibly affects the decision to choose cloud hosting."} {"_id": "19653", "title": "Looking for recommendations for Windows Web hosting companies", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I've always used Linux web hosting companies such as Dreamhost and Media Temple. I have been asked to host a website that has .asp-based files.
Does anyone have any recommendations for a Windows web hosting company that is, among other things: * reliable * good support * quality reputation * reasonable price"} {"_id": "45335", "title": "I want to build a script CDN for my place of employment", "text": "I write a bunch of scripts (JavaScript, PHP, Perl) for my work. Right now, people have to get the scripts themselves from our SharePoint and then install them as either a bookmarklet or a Greasemonkey script. What I would like to do is have a server that I could host all of these scripts on (JS only, of course), so that if I make a change/fix/enhancement to a script, they don't need to reinstall it and it will be updated automatically. What are some good choices for this? Bandwidth will not be super excessive, as each script is probably 3-5KB and they are not run 24/7. Could I use something like an EC2 instance to host them? How reliable are VPS providers (lowendbox.com, etc.)?"} {"_id": "58945", "title": "Hosting that makes setting up multiple sites easily", "text": "I'm currently running several websites, both for myself and for clients. Most of them are smallish sites, but there are 1 or 2 that have promising futures. My question is: since it seems like the number of websites under my control will increase over time, what is the best way to set up new sites, and which is the best hosting solution for that? I would like to have the following \"features\": * a different IP for each site * custom nameservers for each site * different control panel logins (cPanel or anything else) for each site, so that if I have to sell or rent a site, the buyer/renter only has access to that one site's panel. What would be the best solution for this type of hosting need?"} {"_id": "6310", "title": "Simple free web hosting to place my web page", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Please advise me on simple web hosting.
I just want to upload a simple webpage which I created; I don't need wizards or generators."} {"_id": "29615", "title": "What's a reliable way to find a webhost?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm starting to come under the impression that just about every competent webmaster is using a VPS. I'm not quite there yet, but I still want a decent job done hosting my website. I'm currently using asmallorange.com based on someone's suggestion, but I'm starting to regret the decision. Searching things like \"web hosts\" or \"web host reviews\" yields results on Google, but I have no basis to trust those results. So, is there a reliable, trusted website or article that I can read about webhosts?"} {"_id": "49303", "title": "Cloud hosting with multiple SSL Certificates", "text": "Can anyone suggest a US-based cloud hosting option that allows multiple SSL certificates to be installed on a single account? Preferably a single vhost. I was going to use Media Temple's Grid Service, but they only allow a single SSL certificate. I'm hoping to find a cloud service that allows a single virtual host to have multiple IP addresses on a single account, with multiple SSL certificates on that single account."} {"_id": "10571", "title": "Suggest windows webhost provider for following requirements", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We have an ASP.NET MVC 3 based web app which uses **SQL SERVER 2008** for its database. We also have a client-side desktop application which uses SQL SERVER 2008 as well. While developing the system, we are able to sync tables using the SQL Server **Replication** feature. Now we want to host our site on a web server, but we are clueless about it. If any of you have a similar system working, please suggest a cheap but reliable webhost which supports Replication.
Initially there will be approximately **10 or fewer clients** who will perform replication **2 or 3 times** a day. The size of the database will be less than 4GB for sure."} {"_id": "18671", "title": "Has anyone used WebFaction hosting for large traffic websites?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm at a point where I need to move a large-traffic website off of Media Temple Grid Server (it's maxing out the resources there now, and they have actually asked politely for it to be moved to something more suitable) and on to a new server that can handle things better. My first thought was setting up a VPS. The site has two main components, the website itself and a forum. The website is currently coded in PHP with MySQL and is mostly static HTML files dotted with PHP scripts. It's old and cranky, and ideally I want to move it to something like Django. The forum is a Simple Machines forum written in PHP which uses MySQL. They both get a lot of traffic. Bandwidth is up in the 600MB to 1GB a month range, with over 500,000 visitors a month. The forum gets 400+ posts a day with 40,000 visitors a day. Like I say, I was going to move each part to its own VPS, one for each. But then I thought... I'm having to set this all up myself; I need to get backups sorted; I'll need to maintain the website myself. It's all a lot of work, which I'm not sure I want to take on. So... then I thought of WebFaction. I've used them for a few small websites. I've contacted them to ask this exact same question, and they indeed said they could handle it. But they would surely say this. I'm curious to know if anyone here has used WebFaction for large-scale websites, stuff with a lot of traffic. Can WebFaction handle large-scale websites, in your experience?"} {"_id": "19636", "title": "Web Hosting for a small company with a few sites", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
I'm the web developer at a small company based in the UK and we are looking into getting web hosting. I have been looking around for good deals, but good deals don't always mean the best hosting provider. So far, I have seen two that 'look' to be good: http://www.justhost.com/ and http://www.fatcow.com/ Does anyone know if these are good? If so, which is the one to go for? Also, I want to know if there are any other companies worth considering, or any that are known for being excellent, that either you use or know of."} {"_id": "47993", "title": "Best server setup for a website that streams videos/audio", "text": "I have an entertainment website that hosts both videos and audio, and these are the 2 problems I'm having: 1. Storage is an issue on my dedicated server; I'm going to need more space soon 2. The site buckles when it gets a significant amount of traffic I looked at numerous 3rd-party video/audio hosting solutions and they have very limited storage and are very expensive. Storing files on Amazon is affordable at first, but once you start to scale those charges skyrocket. Should I host the website on one server and then use another to just stream the files? How would I even begin to set up something like that? Looking for the most cost-effective and most reliable solution."} {"_id": "13713", "title": "Webhost ( linux shared ) which has php imap extension enabled and ports not blocked", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Ok, so I've had it with GoDaddy. What I need from you is a recommendation of a web hosting company that has the PHP IMAP extension enabled and, even with this extension enabled, the ports won't be blocked (I've run into 1 host with this problem ... I could use the PHP IMAP functions but ... connection timed out). Basically I want to make use of osTicket and write an app that retrieves emails from Google accounts using IMAP/POP3.
I am only asking this here because I want to be sure that the PHP IMAP functions actually work, not merely that the extension is enabled while the ports are blocked (which is useless). PS: I don't want to rent a whole server. I just need a standard hosting account (maybe with cPanel). Thank you."} {"_id": "43177", "title": "Running Two Websites On EC2 Or Alternative Hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I've been developing and planning to launch my website using Amazon EC2. I'm planning to launch two websites, one after the other, but realise that I can't share the EC2 instance. As everyone knows, Magento is a CPU monster, and I'm planning to run on High-CPU On-Demand Instances, Medium, at $0.183 per hour, which costs like $120 per month. I'm based in Singapore and only aim to have customers coming from Singapore. So far, Amazon's speed has been satisfactory, perhaps because they have one datacenter here in Singapore. Here are some requirements: two Magento websites (one will require SSL in future), running the BITNAMI LAMP stack as I'm not very familiar with setting up a server. I'm planning to have stuff like Varnish and extensions for Magento; I'm not sure if they would conflict. I was thinking I could make good use of the CPU power by running 2 sites on it instead of just one :\\ Any other alternatives? And what's your suggestion?"} {"_id": "15141", "title": "IIS 7.0 free hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a website that offers free hosting on IIS 7.0 or IIS 7.5 other than Somee. Does anyone know such a site? Thanks!"} {"_id": "17588", "title": "Deliver my website all over the world hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm building a website in a lot of languages that should be available all over the world. That means I would like to have low response time, especially in Europe, Asia and South America.
My website runs PHP scripts, needs a database, etc. A classic Linux/Apache website. Now I want to ask: what's the best hosting solution, and can you recommend any companies that offer the services I need? Thanks a lot!"} {"_id": "10772", "title": "Looking for VPS Hosting for a LAMP Web Application", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Trying to find a _Managed VPS Hosting Solution_ for a LAMP Web Application. * The more CPU, RAM, and disk space the better * Don't need a huge amount of bandwidth for now * Would like to be able to easily grow into a stronger server * Have really responsive, dedicated, smart support staff -- our current hosting is just terrible My main problem is that I can't even find a non-biased website out there that does a proper comparison of VPS hosting providers. Can anybody either suggest a reviews/ranking site or a host with a proven record? How would _you_ go about finding the best hosting service? Thanks a lot! Ali"} {"_id": "13058", "title": "How can I choose between Linux and Windows hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am a relative beginner when it comes to choosing web servers and hosting plans. I'm about to sign up for a hosting plan with GoDaddy. My main requirement is ColdFusion and MySQL. The plans on offer include Linux- and Windows-based plans. Which one should I choose, and why? I don't have a lot of requirements other than what I mentioned above. I have never used Linux before, but I doubt I'll ever need to do anything beyond tampering with my account. What are the main advantages of one over the other?"} {"_id": "12740", "title": "Can anybody recommend a hosting service where security is paramount?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Can anybody recommend a hosting service where security is paramount?
We are looking for a VPS or dedicated service that offers hardware-based firewalls (i.e. Juniper, SonicWall), a hardened Linux-based environment, software-based intrusion prevention services (minimum cfs, mod_sec), and no latency issues. A cPanel-based environment is preferred. The host should provide the initial server and security setup, including disabling all unnecessary services; we will rebuild all websites."} {"_id": "20086", "title": "MVC3 webhosting common resources for under $5/month", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking for very cheap, under $5/month, MVC3 hosting. I found some offers which seem too good to be true, and others more expensive. Could you please recommend a company that you like? I found other posts on the internet but nothing very current, and I know I can trust this place."} {"_id": "15378", "title": "High quality/performance shared hosting (in northern Europe)", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I work as a web developer on almost all levels. However, my typical customer is 1-5 guys running some sort of consulting business. They have (or want) a web page with some kind of CMS so they can perform most (or all) editing themselves. I normally opt for Concrete5 as my default CMS because it's the most user-friendly (and free) CMS I have found. My good recurring customers I host on my own server as a service, but I need a good host for the customers where I want to deliver a product and not be responsible for whatever may happen in the future. However, I still struggle with hosting! Experience shows that the typical ~1$ shared hosting is way too slow to run Concrete5 smoothly, and a VPS is out of the question because I don't want to maintain it. So, where can I find a fast (from northern Europe), reliable shared host where I can put a site and not have to worry about the server going down or being unmaintained?
I expect this should cost around $10-$20, but I'm open to all kinds of suggestions because different customers have different budgets."} {"_id": "43924", "title": "Multiple websites on a VPS", "text": "I am going to create my own network of sites. I have to create at least four sites: e.g. myblog.com, myfreebies.com, mystore.com, my2ndstore.com. I do not want to buy one host for every one of them; I am just going to manage all of them on one host. While searching I found that VPSes are good for my purpose, but I have problems choosing a good configuration. At the moment I am going to buy a VPS with a 1-core CPU, 256M RAM, 20G hard disk, and unlimited bandwidth. Is it a good configuration? Can the mentioned VPS serve high numbers of visitors? Is it good for page rank (as you know, higher speed sites = higher page rank)? Any other suggestions are appreciated."} {"_id": "12897", "title": "Hosting services for JEE", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm using EJBs, JPA and GWT. I was wondering if someone knows a hosting service that offers what I need (like a Glassfish or JBoss server). I need a servlet and an EJB container."} {"_id": "3972", "title": "Cheap Hosting Provider for Business Splash Page?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a provider to host a splash page for my business. One page (index.htm) with some basic HTML (company name, logo, contact details), and a CSS stylesheet. That's it. It's mainly for reference (to complement business cards). I also don't have a domain purchased yet (but the name is available). What's the cheapest provider for this scenario? (If they also offer the DNS, bonus - two birds, one stone.) As it's only one page, and no dynamic content - I obviously don't need a dedicated server; a VPS/shared server would be fine. Cheapest/most well known seems to be GoDaddy.
Can anyone name/recommend a few more?"} {"_id": "17810", "title": "Excellent WordPress unix hosts with SSH access?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? My boss is sick of DreamHost's excuses. We have a popular WP site on a Unix VPS with DreamHost. We'd like to find a new, more reliable host who can handle a WP site with traffic of perhaps 2000 uniques a day and who offers SSH access. Anyone have a recommendation?"} {"_id": "20452", "title": "Is there any free host which supports PHP with curl enabled?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a web host which supports PHP with the CURL extension enabled. I tried a few of them, but they keep showing me errors."} {"_id": "68689", "title": "Which Cloud Region Should I select for my blog site?", "text": "I currently have my blog set up with shared hosting (GoDaddy). After recent growth in traffic, I feel like it's time to get serious by hosting it in a better place; I am thinking about the cloud, obviously. However, on Amazon/DigitalOcean etc., I have to select the region where I want to host the site. As part of shared hosting it was hosted in a US region, I guess. But as my site's traffic is mostly based in Asia, I am thinking of choosing a nearby location now, say, Singapore. Though I know it probably makes sense to do so, at the same time I also don't want my North American visitors to suffer; rather, I want to grow more visitors in this area (I myself currently reside in Canada). Just wondering if the decision to move the server to the Singapore region could be very bad for North American visitors, or if it will be kind of OK for now? Should I stick to a US-based server for any reason, from a user experience/SEO perspective?"} {"_id": "16119", "title": "I need a webpage to host my javascript!", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
Does anyone know a website that would host my JavaScript on their page? I have a research project that needs to collect some RTTs from all over the world and compare them together. I have written the JavaScript code for that, but I do not have a high-hit-rate website to put it on to collect data. I know it is a bit of an odd question to ask, but do you know any website or any trick that can help me? Note that the script would not do any harm to anybody! :-) Thanks, * * * Decad is right, I basically need some people to put my script on their \"high-hit-rate\" websites ... so I can collect data from a large number of clients... Of course, the script runs in the background with no harm to the page. It basically measures some RTTs and submits them to a server. I already have some pages, but they barely get a hit from outside! Thanks,"} {"_id": "10200", "title": "JSF Glassfish host", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Is there any Glassfish and JSF 2.0 hosting for free or at reasonable prices? I mean something which is good for personal projects."} {"_id": "20782", "title": "MongoDB Hosting: MongoLab vs MongoHQ vs MongoMachine", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a managed MongoDB hosting solution; here are 3 I found comparable. Does anyone have insights into which one to use?"} {"_id": "54095", "title": "PHP web app hosting considerations and advice - high traffic expected from day 1", "text": "I'm in the process of researching hosting providers for a high-traffic Facebook application. The client has over 300,000 likes, so we can expect quite a bit of traffic when posts are made about it, and there are a dozen or so celebrities involved in promoting it on their end. The app itself, in staging at the moment, is built with Codeigniter and is running a MySQL database. Only one table, of two, is expected to grow significantly through user interaction with the app.
Most of the traffic will be coming from the UK. Overall page sizes with all assets loaded will be less than 1MB. For other similar applications we have used a Cloud VPS with the following configuration: Monthly Bandwidth - 1TB RAM - 2GB Storage - 75GB CPU - 2 x Xeon 2.33GHz Any advice would be greatly appreciated!"} {"_id": "46884", "title": "looking for a comparable service like one.com, but need more domain parkings? any ideas?", "text": "I have a simple and small website, webshop, email and mobile version. one.com would be OK for this, but I want to store some more domains, which one.com won't allow. Thanks a lot people! rikolino"} {"_id": "47836", "title": "hosting images on image hosting websites", "text": "My website's traffic has increased a lot, and it now sometimes has difficulty handling these users, especially during busy hours. I want to save some of the website's images on image hosts so that they load from there, just like the jQuery that we load from code.jquery.com or code.google.com. I wanted to know: is it wise to do so? Do you know any good websites with permanent image hosting? Any suggestions?"} {"_id": "9170", "title": "Looking for Windows shared web hosting with PHP support", "text": "I'm looking for Windows-based shared web hosting which supports multiple hosted web sites (multiple domains). Supported technologies should include: * ASP.NET 4, ASP.NET MVC * IIS 7 * MS SQL 2008 * PHP, MySQL It is for my hobby projects, so it should not be too expensive. I tried GoDaddy's Windows Deluxe hosting, but the experience is very bad and I want to move elsewhere. WordPress hosted on GoDaddy's Windows hosting is unloaded every few minutes, and the next request takes around 20s to complete. A following request to an empty site takes around 3s to complete. Even a request for RSS which transfers 1.2KB takes several seconds.
The delay happens in PHP processing, because static content is served within 200ms. It helped to migrate to Linux hosting (all requests are served under 1s), but Linux hosting is not what I'm looking for."} {"_id": "57417", "title": "Looking for reliable, long term, free hosting", "text": "My question: I have a site powered by Wordpress that I need to host, and it has to be done for free because it is for an underprivileged school. Because my favorite site, 1freehosting, isn't working for me right now, can you recommend some free hosting that is reliable and long term? The background: It is just another Wordpress site, for a journalism club. I offered to try to find free hosting, and find some paid options as well. I already have a good list of paid hosts, but I need free. I want a server powered by Apache, reasonable disk space that is enough for Wordpress content and pictures, about 5-10GB in bandwidth every month, and it must be really reliable. Please help me!"} {"_id": "23806", "title": "Free cloud hosting for asp.net facebook apps?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Facebook offers free app-hosting for PHP, Ruby, Node.js and Python, see https://developers.facebook.com/blog/post/558/ . Does anyone know whether there is something similar for asp.net?"} {"_id": "16564", "title": "How to put my web site online", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I've developed a web site using symfony and wamp server on my PC, and now I'd like to make it accessible to everyone on the web. So what I'd like to know is the best hosting provider and domain registrar, considering that it's a symfony project. It's my first time launching a web site, so I don't really know if I can have `ssh` access to the host's server, considering that it would be better than only uploading through FTP.
Subversion is also important for quick updates."} {"_id": "27644", "title": "Is there any free host which supports php and mySQL in utf-8?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Is there any **free host** which supports php and **mySQL queries in utf-8**? I've already tried to use x10hosting and 000webhosting, but they don't support utf8 mysql queries (I got mojibake). The default encoding of mysql on both sites is latin-1, and you're not able to change that. Is there any other **free host that fully supports utf-8**?"} {"_id": "18678", "title": "Please recommend some good hosting for Facebook App (free/paid)", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am learning to program a Facebook app, and so I am looking for some free/trial/paid hosting to host my PHP and DB."} {"_id": "17142", "title": "hosting website with video and audio conversion", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have created a website with audio and video conversion online, providing the output file via email or instant download. I want to know which kind of hosting server will work fine in my scenario. The video upload size is limited to 50MB per video, and audio size is limited to 15MB. Every two hours the completed videos are deleted, to ensure storage capacity. Please let me know a web host/server which can be used for this purpose."} {"_id": "53866", "title": "What is the difference between cloud hosting, web hosting, and VPS (virtual private server)", "text": "I am building a simple Android app that will connect to an online database. But I am not sure what I need in terms of hosting. Would it be cloud hosting, web hosting, or a VPS (virtual private server)? I am looking for a free one where I can create a REST web-service and a MySQL database. There will be no website.
This is just for testing my app, so I don't need a lot of space or high bandwidth. Basically, I just need something simple and free that works, but I don't know what I am looking for. Many thanks for any suggestions."} {"_id": "64672", "title": "Free or Low Cost Web Hosting for Small Website", "text": "I have a small website (between 2,000 and 10,000 page-views a day). I'm looking for a free or low-cost web host. I tried 50webs.com, but their server breaks down. So as not to cause debate, I am also just looking for links to good information sources on web hosting, if just finding a good web host is too general. I currently only use HTML, CSS, and JavaScript, though I'm considering learning PHP and other more advanced languages to step up my game."} {"_id": "16856", "title": "Looking for a webhost that offers both Linux and Windows", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? To deploy my .NET web application I need both Windows (for the frontend) and Linux (for the database) virtual servers (VPS). What Windows+Linux host would you recommend? Of course, I am not interested in Windows- or Linux-specific hosts. I also found some mixed hosts who provide low prices for Linux but high for Windows, or vice versa."} {"_id": "23521", "title": "web hosting for ffmpeg.exe", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We're searching for a web host that supports ffmpeg.exe. Can anyone give us a reliable web host that supports ffmpeg.exe?"} {"_id": "42528", "title": "LAMP in the cloud", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? For my job, I have to use lots of LAMP-based systems like Wordpress.
Is there a cloud service out there that can simply give me virtual LAMP servers on-demand so that I can, for example: download the complete filesystem of a Wordpress site, \"spin up\" a LAMP instance solely for developing that Wordpress site, upload the filesystem, develop the site, and then, when I'm done developing, just take a new snapshot of that filesystem for delivery to my client and kill the LAMP instance? Basically I'm just asking for a pool of sandboxes in which to develop Wordpress sites without having to set up any LAMP stuff."} {"_id": "47986", "title": "What is the web server that suits my need?", "text": "I have a website which has around 500 visits per day, 10000 per month, and it is powered by Joomla. It is a website where users can create new entries and can also look up entries which are saved in the database. So, it communicates a lot with the database. The website is very slow now and we plan to move it to a better server. I really need your help to know what server specs to look at and how much would be sufficient."} {"_id": "28320", "title": "Webhosting with custom database choice", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am trying to find somewhere to host a website which uses OrientDB as its database. My budget doesn't stretch to a dedicated server where I can configure everything as I need it. Rather, I am hoping to find somewhere, ideally UK-based, that will allow me to install (or will install for me) OrientDB on their server, of the normal shared-server variety. Is anybody able to point me in a good direction for this please (whilst UK is preferable, it is not essential)?"} {"_id": "25124", "title": "A European Mail Hosting Provider", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? What are reliable e-mail hosting providers in Europe (for the laws)? It's important that they have existed already for some time (no newcomers), or are used by bigger companies.
One provider I know is SwissMail, but I would like to have some diversity. I don't need the cheapest, but it would be good if you could suggest providers which you have experience with."} {"_id": "29384", "title": "Finding a good CentOS VPS host", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking to host a new Wordpress blog/portfolio on a CentOS VPS. I know that a shared hosting account would do for such requirements. However, I want to run NGINX with an opcode and memory cache, as I am a bit of a speed queen. The problem is I don't know who to buy from, and am wary that just because the numbers look good doesn't mean the service is good. I am happy to stay unmanaged for the time being, and am in two minds whether or not to use cPanel (it is pretty expensive, plus I plan on using Centmin Mod to ensure things like PHP, NGINX, etc. stay up to date). I was hoping you could provide suggestions of quality hosting providers who are affordable (I don't want to spend much more than $20 a month for the time being, but want an easily upgradeable system should I ever need more resources). **In short I require:** * $20 max budget (small room for manoeuvre) * CentOS (preferably 5.* and 64bit) * Speed, lots of it (I am no expert on the balance of resources, so need a little guidance here) * an upgradeable plan should the time arise * A good experience :) Thanks."} {"_id": "10711", "title": "Web Hosting Checklist", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am a web developer that is starting to look into hosting his own website. I would like to showcase my programming skills (PHP, MySQL, C#, Wordpress). My knowledge of languages is fine, but the actual hosting side is where my knowledge starts to get a little shaky.
I know the basics (bandwidth, sub-domains, rewrite rules), but I would love your input to help me formulate a checklist of things I should be on the lookout for in a web-hosting service. Also, I was wondering if there are any reliable hosting providers who give you the option to host both C# code-behinds and PHP code, as I would like to have two versions of my site, one in C# and one in PHP. The hope is that if I need to look for another job, this website will help me show possible employers my server-side knowledge. I hope this is enough info; I did some research online but found a bunch of useless articles, and I've always had luck on the StackExchange sites. So hopefully you can help me. Thanks a lot."} {"_id": "7606", "title": "Any High Availability CF 8/9 Ent VPS / VDS hosting out there?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I will appreciate it if you could list some that you may know of. Those that I could find are standard. I found one that is enterprise, but with no high availability."} {"_id": "14802", "title": "Web Hosting Advice for Project", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am working on a project that will be released as open source in the latter part of the year. I am starting to think about how the accompanying website will be hosted, and would greatly appreciate some advice. **Requirements:** _Domain #1_ * Information about the project itself (just pages and pictures). * Documentation / Wiki * Forums * Download of project source (approx 3MB archive) * Download of various themes and community contributed content (est. sizes 10KB ~ 512KB). _Domain #2_ * Primary company website that offers products and services. This will be primarily pictures and pages. What kind of web hosting would be best for a project like this? I am working on a very tight budget and can only afford to spend up to \u00a3250 per year for hosting this.
I was considering using some sort of VPS hosting. I found the following companies, which seem to offer around this price range, but they have very mixed reviews. * http://www.webhosting.uk.com/ * http://www.eukhost.com/ * Godaddy UK * uk2.net My company is based in the UK; how important is it for me to use UK-based hosting? There are plenty of overseas hosting companies that are considerably cheaper. When it comes to bandwidth, how many downloads will **bandwidth: 100GB** get me? Any advice would be very greatly appreciated!"} {"_id": "10980", "title": "Cheap ASP.NET Hosting - Multiple Domains", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Can anybody recommend some quality ASP.NET hosting providers that allow you to host multiple domains without making you use a \"reseller\" account or purchase multiple accounts? I really only need email accounts for one of the domains. I'm looking for something about $20 to $45 USD."} {"_id": "52302", "title": "Website hosting", "text": "First off, I apologize if this is the wrong place to ask this question; if it is, please direct me to the correct place. OK, with that out of the way, I have a question about where to host my website. I am a fairly experienced web developer who hosts my stuff on a github.io page. This has worked for me, but I want to try hosting on my own server that I control. Can you webmasters please give me an idea of what service I could use that would allow me to have a custom domain and a server that would run basic things like PHP, MySQL and server-side scripts? I have read this, but it did not tell me what services would provide the best web hosting for my needs.
**Other Details:** My site will probably only get 10-20 visits a day, so I'm not looking for heavy-traffic web hosting; PHP support is needed; MySQL support would be nice, but I can live without it; I would like to only be spending around $15 a month; I would need a custom domain; and I don't want any of that www.myname.bu1.4.com/freehosting"} {"_id": "9880", "title": "Looking for Windows Hosting Reseller provider with decent reputation", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a Windows web hosting reseller provider with the following requirements: * Plesk control panel (for reseller and customers) * Support for Magento, Joomla, and Wordpress * A decent reputation I'd prefer to not go the VPS route because I do not want to support the server."} {"_id": "56412", "title": "How many websites (such as in example) can I run on a Virtual server with 1CPU-core (2.0ghz) / 1Gb Ram?", "text": "I'm at the end of my web hosting contract and I'm thinking of changing my plan. Instead of a 'simple' web hosting plan, I'm thinking of 'buying' (renting) a virtual server from them, to host multiple websites on. A single subscription (\u20ac 9,99) will give me a 'cloud box' where I can run one virtual server. I will have the use of 1 single core and 1.024GB of RAM, and enough bandwidth and data storage. However, I'm really wondering how many websites I can run on such a server, with websites such as this (an html/jquery website) and Wordpress websites such as these - keeping in mind these websites will attract up to 20 viewers a day, each, at most. Based upon the system resources and the website examples that should run upon such a system, how well would the server/websites perform? Let's say I'd run 15 websites on such a server, each like the examples above, each with up to 15 viewers at most, spread over the day.
With a score of 1 (not well), 2 (below normal), 3 (reasonable), 4 (more than enough), 5 (excellent) - what score would you give it, to answer the question 'how well would it perform'? As opinions are not allowed (right?) - please clarify based upon facts."} {"_id": "6829", "title": "Hosting solutions for 10,000 daily users, asp .net 4, sql server 2008 R2?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I've run a custom-built social network on top of asp .net mvc & sql server 2005 for the last 2 years on discountasp .net. I've been satisfied with the value, as it's cost me around $500 over that period. The limitations are 1GB storage, which meant the only media on the network I could afford was 1 profile photo per member, and a 1GB database, which means I have to periodically delete all my logs; I'm left with an SQL database of about 600MB. The social network has been growing linearly and consistently since inception, and this solution will not hold up for too much longer. I also wish to build a whole new, much more involved site, richer in media and with many more options for user interaction, meaning a database schema double the size and much higher DB traffic. I would like to find a cost-effective (like $300 per month or less) solution to host the new site, which would have around 10,000 daily visitors. Perhaps I might need a 10GB sql server database, which must be 2008 R2 because I've got cool spatial stuff going on. I'm also wondering if I can put my media on a separate cheap hosting solution, where I don't have to pay through the roof for storage and bandwidth, and implement it so users seamlessly switch servers (and hosts) while they use the application. Has anyone ever done this? If so, which host did you use for your media? Any help greatly appreciated!"} {"_id": "22585", "title": "How / Where can I host my Java web application?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
Edit: Just to be clear. I want a place where I can do $>java myapp, where myapp uses plain old ServerSocket. No jsp, no servlet, just very basic stuff -- may as well be a chargen server. In case you want to skip to the crux of my inquiry, just read the **bold** type. I just finished my CS degree (at 39 years old :-)). For my final project I designed and built a system that can provide local positioning / location awareness to mobile wifi devices (only have an Android client thus far). The server receives data from clients, processes it, and responds to the clients with messages containing information about their respective locations. I would like to continue the project (perhaps release it as open source, but that is a different discussion). Thus far my server application has been running on the CS department's hardware, where I could pretty much do whatever I wanted. I'm getting kicked off that system in a few weeks, so I have to find a new home for my server application. **I need a host that will let me run my Java server (along w/ a mySQL db) -- preferably on the cheap since I haven't yet got a job. I have very little experience with the \"real world\" of web development / hosting. I'm having trouble figuring out what kind of hosting service will let me run my application as is. If that turns out to be a tall order, then I need to know what my options are for changing things so that I can get up and running with some hosting.** As an aside, I'm also researching whether or not I should rewrite this in a different language, trying to figure out if there is a substantially better (for whatever reason) one for what I'm doing. This might also have a bearing on my hosting needs. One possibility is to write the server in something more widely accepted by hosting services. I have been searching for answers to my question and haven't found quite what I'm looking for. Part of the problem might be that I don't know exactly what terminology to use.
If there is a good answer to this question elsewhere, please feel free to point me towards it. Thanks for the help / advice."} {"_id": "7525", "title": "Shared hosting with dedicated IP", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Can you please mention here if you know any shared hosting providers who give the option to get a dedicated IP? So far I know of one - Netfirms. Please list others if you know of any. The reason why I am looking for such a thing is: i) In most shared hosting plans, you end up getting better CPU/burst RAM than a VPS, provided you don't abuse it. ii) A dedicated IP is good for SEO. For example, many times you may end up getting an IP where some p*** sites are also hosted in shared hosting."} {"_id": "57486", "title": "SaaS in south africa", "text": "I am looking for a place to host a mvc .net website. The primary users will be in South Africa. As far as I know, azure and appharbor have no servers in SA, and I do not know if the latency will be an issue, so my question is if any of you guys have any thoughts on this. My alternative is some local VM hosting, but I would prefer a SaaS route."} {"_id": "11182", "title": "Comparison of web hosting providers", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? My current web host that I've been with for over a decade is jerking me around, and I'm ready to find a new provider. Holy smokes! There are so many! It's overwhelming! I want to see a comparison of the most popular hosting providers. I want to compare things like: * Bandwidth allotment * Storage limit * Databases * MySQL * SQL Server * # of Free Domains * # of Sites etc. Better would be a tool that lets me supply parameters that will narrow down the list, i.e. only Windows with unlimited storage, unlimited bandwidth, free SQL Server databases and less than $5.00/month. I've found a couple of sites that do something like this, but not very well.
Is there something like what I'm looking for already on the web? I don't want to go to dozens of different hosting provider sites and jot down all the stats manually."} {"_id": "7660", "title": "What Sharepoint Server Hosting provider would you suggest?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We run an international organisation that is looking at implementing MS SharePoint. Who would you suggest we use? Currently we are looking at: Verio, Rackspace, SherWeb. In particular we're hoping for a reliable provider that has better international connectivity. Server uptime and service responsiveness are our main key requirements. Greg"} {"_id": "61463", "title": "Is there any kind of relation between the activity in a website and the hosting that I have to purchase?", "text": "Is there any kind of relation between the activity on a website and the server that I have to purchase? I need to give a budget for a website whose traffic could increase in the future, so I would need to know what type of server I should purchase."} {"_id": "921", "title": "Virtual Private Server vs. Dedicated Hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? What is the difference between a virtual private server and dedicated hosting for my website?"} {"_id": "26135", "title": "Where to get a shared hosting with MySQL 5.5 or higher?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I need to utilize the features of 5.5 and my current host (liquid web) doesn't provide 5.5 for their shared plans. I do not want to get a dedicated server right now, but will in the future. I just need shared hosting at the moment that supports 5.5."} {"_id": "20570", "title": "Your advice on Cloud Hosting", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I would like to run my website (.NET 4 + MS SQL 2008) on the Cloud. What I need is: 1.
Managed service for OS updates and security patches. 2. Database MS SQL 2008 (ok even if shared with other websites). 3. Reliable technical support. I looked at MS Azure; the service is quite flexible, but their website does not provide cost and information about the support (I also have no experience with deployment on that platform). I am also evaluating MaximumAspESP (less scalable), which provides the points above. I would like to ask your advice on a serious hosting company (for cloud services)."} {"_id": "15454", "title": "Free asp.net hosting for my college project", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am a student developing an asp.net website for my college project. I'd like to put it online for everyone to see. Are there any webhosts who allow me to host my web site for no cost?"} {"_id": "10867", "title": "I need a quality linux UK based webhost", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Can anyone suggest a good (ideally) UK based reseller web host? I don't want the cheapest solution. I want a robust, quality one. Needs to run php5 and MySQL. I have been considering a managed dedicated server instead of a reseller account, so if anyone has good experiences please let me know."} {"_id": "23819", "title": "anyone know a good cheap asp.net mvc3 webhost?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for an asp.net webhost to use, not looking for anything particularly fancy, just something to test some stuff. It will need to support SQL databases..."} {"_id": "4882", "title": "High Traffic Web Host Solution?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm currently shopping around for a web host for our website we are hoping to release in the near future. This is my first real step into this area. Just wondering what I should be looking for.
It is an ASP.net MVC website with an MS SQL Server backend. I need to know that the server will not buckle if the traffic booms. Currently I'm looking at a managed dedicated server from Singlehop."} {"_id": "34722", "title": "How to Determine VPS Hosting Resources Needs for my upcoming Wordpress blog? How much resources should i purchase?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have decided to purchase VPS hosting but I'm getting confused about the amount of resources I need. Wordpress will be used as the platform. The blog I want to set up is assumed to have traffic between 20k - 25k visits per day, with a rate of 5 pageviews per visit... there is no download facility provided... the content of the blog will be text, images & videos (which will be used rarely)... The main question is: for the above requirement, how much RAM will be enough? How much CPU will I need? How much bandwidth will be enough? How much disk space? Any other requirement? Thanks in advance."} {"_id": "18892", "title": "Hosting a website with Rails, Groovy or Java servlets - shared hosting or VPS?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Are there hosting companies that cater for Rails, Groovy and Java servlets? Or is my only choice to use a VPS? If so, how do I publish my website on the hosting?"} {"_id": "50401", "title": "Recommended linux hosting with private whois", "text": "_This question is not a duplicate of this one, which I already read till the end._ I am not asking for the best hosting solution, because this would invite moderators to close my question. I just want you to help me choose one hosting with these requirements: * Free private WHOIS _(plus if I don't have to provide my name and ID, especially ID)_ * MasterCard payment method * Basic plan _(I want it for a blog)_ Note: The not-provide-ID requirement is for privacy reasons.
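For the VPS-sizing question above (20k-25k visits/day at 5 pageviews per visit), bandwidth at least can be estimated rather than guessed. A sketch of the arithmetic; the average page weight is an assumed figure that should be replaced by measuring the real pages:

```python
# Monthly transfer estimate for a blog at ~25k visits/day, 5 pageviews/visit.
# `avg_page_weight_mb` is an assumption, not a measurement.
visits_per_day = 25_000
pageviews_per_visit = 5
avg_page_weight_mb = 1.5   # assumed average, images included

daily_gb = visits_per_day * pageviews_per_visit * avg_page_weight_mb / 1024
monthly_gb = daily_gb * 30

print(f"~{daily_gb:.0f} GB/day, ~{monthly_gb:.0f} GB/month")
```

Even with these rough inputs, the result (on the order of a few TB/month) is concrete enough to compare against a VPS plan's transfer allowance; RAM and CPU needs depend far more on caching and plugins, so they are best found by load-testing rather than by formula.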
I do not want to be anonymous, but neither do I want to give my ID, address and name to an unknown party."} {"_id": "3857", "title": "Frustrated trying to find a hosting company for my .Net site ?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I have spent weeks trying to find information on hosting sites for my site. I currently use 1&1, but I need to have support for add-on dll's (iTextsharp) and they have told me for security reasons they can't do that. I use .net, c# and MSSQL. I'm in the UK but not bothered where I host. So off I went researching on the web, and every time I thought I had found a good one I would read reviews and they would be bad! I am down to the following... does anyone have any views on them, or can you point me to a GOOD site which has proper reviews from lots of users or EVEN BETTER a reliable .Net host which would support my needs: DiscountASP, U2-Web, 000webHost, HeartInternet, Somee, Arvixe, Daily.co.uk, GoldPuma, TitanInternet, Aspnethosting, IXWebHosting Thanks Peter"} {"_id": "2344", "title": "Investigating and finding a web host", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am looking for a website host and have been searching the web for websites similar to mine, trying to see what hosting company they are using. I keep looking up details of websites I like on netcraft.com, and I keep finding 'n' number of netblock owners with random or generic names. Apart from a few popular ones (eg. Planet.com, MT), I find it extremely difficult to find out who the hosting company is. For eg. the Netcraft site report for a website www.randomwebsite.com gives the netblock owner as RandomWebHoster.com; if I go to RandomWebHoster.com the site does not exist. The DNS is given as ns1.superrandomdns.com; the website superrandomdns.com simply has a page saying superrandomdns.com. I am looking for a reliable web host with **J2EE support** for an accounting webapp.
How do I investigate stuff myself if I want to find a good host? Where did you begin when looking for a webhost? (Please do not down vote the question. My goal is to make host evaluation easier.) I understand that different types of websites have different needs, eg. a low traffic blog vs. a mission critical webapp."} {"_id": "22855", "title": "Good Place for File/Backup Dedicated Server?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a cheap dedicated server for file hosting / backup only. Basically I just want something with minimal CPU & RAM, but with a maximum array of hard drives (e.g. 4x 4TB hard drives per server, RAID 10). Anyone know of a good dedicated server provider that can offer this type of deal?"} {"_id": "9308", "title": "Slashdotted web site seeks new home", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am maintaining a website that contains mostly simple html (just a little php). Normally the site receives only 4000 hits per month, but it was recently slashdotted by the New York Times (>30,000 visitors and 30 GB in a day) and the web host provider (bluehost) throttled the CPU in response. This slowed down the website considerably. What web host providers would offer a more scalable solution? Ideally I would like a high-quality host that charges by the GB and can handle bandwidth expanding during sudden slashdotting episodes without a reduction in performance."} {"_id": "67467", "title": "website for software distribution: standard hosting or vps hosting?", "text": "I want to create a website on which users can download software, but each time one of the users downloads the software, I want it to be modified: compiled again with some new information, obfuscated, then offered for download. Do I need special hosting to do that (vps or dedicated hosting), or would shared hosting do it? Is that even possible on the latter?
The website should not have a lot of traffic, and the language I would use is ASP.NET."} {"_id": "9422", "title": "place to host a simple php socket server", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I am running a small project that occasionally requires me to run a php socket script through ssh. It uses a few bytes of bandwidth (just some text) which activates my art installation project. I tried a simple web hosting plan but that didn't support sockets, so now I am using a $32 VPS plan on namecheap.com just to run a simple php script. I don't really need to host anything else on it. I find it kind of excessive for such a simple thing. Is there a place I can run my script for a lower cost? Any servers that support php sockets and ssh?"} {"_id": "11460", "title": "full environment for development and production web site", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Is there any hosting that gives you a full environment for development and production web sites, including IIS, SQL Server, SMTP? Thanks."} {"_id": "66947", "title": "Managing a large dynamic website", "text": "So I just finished making a website and want to ensure that it could support a given amount of users at a time. I built it on a shared hosting account, but the thing is that my web hoster has a maximum of 10 simultaneous connections to a database. So, only 10 users at any given time can make a request to the DB. This is a problem because the site is dynamic and loads data from the database on every load. I was wondering what a sufficient type of server would be to host this site (again no accounts, user picture uploads or anything, just requests from a MySQL database and occasional copying of images from the web). I would like to support at least 5000 users at any given time, maybe more.
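A service like the one in the socket question above, a few bytes of text per connection to trigger an art installation, is small enough to sketch in full. Shown here in Python for brevity (PHP's stream_socket_server gives the same shape); the payload text is a made-up placeholder, not anything from the original project:

```python
# Minimal one-shot TCP text service: accept a client, send a short trigger
# message, close. Binds an ephemeral port so it needs no configuration.
import socket
import threading

def serve_once(payload=b"activate\n"):
    """Accept exactly one client, send `payload`, then shut down.

    Returns the bound port so a client knows where to connect.
    """
    srv = socket.create_server(("127.0.0.1", 0))  # port 0 = pick a free port
    port = srv.getsockname()[1]

    def run():
        conn, _ = srv.accept()
        conn.sendall(payload)
        conn.close()
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return port

def fetch(port):
    """Client side: connect and read the trigger text."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        return c.recv(64)
```

Something this small fits comfortably in a script started over SSH or from cron, which is exactly why a full VPS feels like overkill for it; the binding constraint when shopping for a host is simply "raw sockets allowed", not CPU or bandwidth.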
Thank you."} {"_id": "57282", "title": "Webhosting server Frontcontroller support", "text": "I am currently using XAMPP to code my websites, but I want to host them online now. I came across some hosting servers and they had restrictions: one didn't allow the use of PDO, and another didn't allow me to use require_once(); The basic specs that I need are: * Frontcontroller support * PHP & mysql * Apache * free Do some of you know a free webhost very close (same specs) to XAMPP? (So I basically just need to change the PDO username, password and dbname.)"} {"_id": "64663", "title": "Web Host which provides Latex and embedded programming", "text": "Hopefully this is a reasonable place to ask this question. I'll confess I'm a little green when it comes to web programming and websites in general (though not programming). I'm a Math and Physics person. I want to make a personal webpage containing a Math and Physics blog. Ideally the blog should support LaTeX and embedded programs. This would allow me to write, say, an equation for an orbit and then show what the orbit would look like (perhaps letting the reader configure parameters). The programming language can be javascript (though it isn't my favorite language). My budget is around 5 dollars a month. Does anybody have suggestions for a good shared host with these kinds of requirements? And a small aside: it would be useful if I could move the website content, since I might live at a university in the nearish future. They would have servers which could support such a webpage."} {"_id": "8463", "title": "Best cheap Linux hosting with LAMP support", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for a linux hosting provider that supports LAMP and wondering if the community could recommend one.
I was thinking about going with godaddy since their plan is only like $5 / month, but was wondering if there are any others out there that are just as cheap, if not cheaper."} {"_id": "27139", "title": "Approach to assess server requirements for PHP app", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We're going to be launching an app that we believe will ultimately generate a couple thousand page views a day, but on day one it's gonna be close to zero. What I'm trying to figure out is how we can assess our minimum hosting requirements. It's a PHP app on the CodeIgniter framework that uses PrinceXML to produce PDFs. Does it 'generally' make sense to start on a virtual dedicated server rather than the cloud? Is there a way to assess whether our server will need 2GB or 4GB, things like that? I can find my way around Plesk, but I have near zero experience with Linux or cloud hosting and am trying to learn as much as I can before we launch. Any advice as to the best way to approach this will be greatly appreciated."} {"_id": "21275", "title": "How much bandwidth do I need? Please help?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm an inexperienced webmaster trying to estimate hosting requirements for a site. I need to work out a dollar amount for a grant. The tricky part is that traffic fluctuates dramatically throughout the year, because we're an international day of celebration. That means most of the year, we get ~50 visitors a day. Then in the months ramping up to the day, we get between 200-600 visitors a day. In a three day span at the peak, we can get 5,000-200,000 a day. Last year the site crashed because of this and I want to avoid it this year. I'm writing a grant to get better hosting, but I have no idea what the best solution for this would be. I need to have a solid dollar amount it will PROBABLY cost if things go the way they did last year. Any help is greatly appreciated.
EDIT: We're in the USA. Right now we have our site hosted on GoDaddy, but I don't have the details. The site handled the traffic okay right up until June 8th, when it crashed. We don't have an accurate measure of how many people visited, but our sysadmin thought it was a DDoS attack at first. He said it was around 200,000 at once. The site is http://www.WorldOceansDay.org Note: we also use CloudFlare to cache and serve our pages; not sure how that may affect it."} {"_id": "19439", "title": "Choosing a webhost", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm having trouble choosing a web host. I only need a VPS to start my project, but I want to be able to expand to a dedicated server later when traffic is growing, without much hassle. Things of importance: * quality and fast service * price * phpmyadmin * cpanel Any suggestions? EDIT: I live in Norway, but the site is going to be international."} {"_id": "1217", "title": "What is the best Windows VPS hosting?", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm currently using shared hosting. I want more control over my IIS and also I need to run in full trust. There are a lot of options out there for Windows VPS hosting. Which ones do you recommend as the best? Some must-haves: * Has to have great support * Automatic hardware fail overs * Access through Remote Desktop (you would be amazed some don't offer this) * No limit on what I can install on it"} {"_id": "18677", "title": "Free HTTPS hosting with PHP support", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? I'm looking for free hosting with HTTPS and PHP support. Of course a subdomain is enough. Something like Google's appspot.com, but with PHP support."} {"_id": "9945", "title": "Free JSP/Spring MVC Web Hosting Site", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements?
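For spiky traffic like the World Oceans Day case above (roughly 200,000 visitors concentrated into one day), the number that matters to a host is peak requests per second, not the monthly total. A sketch of the estimate; the pageviews-per-visitor and concentration figures are assumptions chosen to illustrate the calculation:

```python
# Peak-rate estimate for a one-day traffic spike.
visitors = 200_000
pageviews_per_visitor = 2      # assumed
busiest_hour_share = 0.25      # assume a quarter of the day's traffic in one hour

busiest_hour_views = visitors * pageviews_per_visitor * busiest_hour_share
peak_rps = busiest_hour_views / 3600

print(f"{busiest_hour_views:.0f} pageviews in the busiest hour, ~{peak_rps:.0f} page req/s")
```

A few dozen page requests per second is trivial for a CDN but fatal for a throttled shared plan, which matches what happened; with CloudFlare already in front of the site, the origin only has to survive the cache misses, so the grant figure should price the origin for miss traffic plus headroom rather than for the full spike.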
I am thinking of hosting a small web application built using Spring MVC. Does anybody know any free web hosting sites that support JDBC as well? I haven't tried a web hosting site before, so I would like to know which sites are free and OK. My app won't take much disk space, and I would like to know if there are sites that run on Tomcat. Thanks."} {"_id": "20325", "title": "Help me find a hoster that allows MySQL Replication please", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? Now that I have found out here that I should use replication for my project, I need a hoster that allows me to do so. Unfortunately my current hoster doesn't, unless I buy a really expensive package. Currently I'm paying around $50 a month and I wouldn't like to pay much more, as I don't need many exclusive features apart from replication (got maybe 5000 visitors on a good day). * I don't want to spend much money, as I don't have much traffic (maybe a few thousand unique visitors on a good day) * I actually need only a fast MySQL DB to set up for replication * Preferably in Germany, as my visitors come from Germany too I just can't find any information about that on hosters' webpages, and I thought it might be faster to ask others before writing to ask every hoster personally. Can anyone help me here? What hoster are you using for MySQL DB replication?"} {"_id": "5306", "title": "django & postgres linux hosting (with SSH access) recommendations", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? We're looking for a good place to host our custom Django app (a fork of OSQA) and its postgresql backend. Requirements include: * Linux * Python 2.6 or (ideally) Python 2.7 * Django 1.2 * Postgres 8.4 or later * DB backup/restore handled by the hoster, not us * OS & dev-platform-stack patching/maintenance handled by the hoster, not us * SSH access (so we can pull source code from GitHub, so we can install python eggs, etc.)
* ability to set up cron jobs (e.g. to send out daily email updates) * ability to send up to 10K emails/day * good performance (not ganged up with a zillion other sites on one CPU, not starved for RAM) * FTP or SCP access to web logs * dedicated public IP * SSL support * Costs under $1000/month for a relatively small site (<5M pageviews/month) * Good customer service We already have a prototype site running on EC2 on top of a Bitnami DjangoStack. The problem is that we have to patch the OS, patch postgres, etc. We'd really prefer a platform-as-a-service (PaaS) offering, like Heroku offers for Rails apps, where all we need to worry about is deploying our code instead of worrying about system software patching and maintenance. Google App Engine is closest to what we're looking for, but they don't offer relational DB access (not yet, at least). Anyone have a recommendation?"} {"_id": "62955", "title": "Dropbox vs Google Drive vs other for free static cloud hosting", "text": "I need to host a static website, a single page deal, and instead of paying for hosting I thought I might use a cloud service; however, an issue I can see with this is bandwidth allowance. Does anyone know which allows more bandwidth per link: Dropbox, Google Drive or another service?"} {"_id": "13757", "title": "Need new secure host", "text": "> **Possible Duplicate:** > How to find web hosting that meets my requirements? My website is being attacked too much lately and my provider is unresponsive. Logs show dozens of attempts to run mysql setup.php scripts from various locations, then the logs vanish. Time to find another host. I lack the skill to config and run my own server -- actually, I'd have no problem setting it up and running it, I'm just not current on Linux security -- so I must rely on a competent host. Can anyone recommend a Linux-based hosting service with solid security and reasonable bandwidth rates? I will be selling a program so there will be a large number of downloads.
I am in Canada, so USA or Canada is preferred. All advice is welcome. Thanks."} {"_id": "42629", "title": "\u201cFile does not exist\u201d in Apache error log when mod_rewrite is used", "text": "I am getting the below error in the server log when rewriting the URLs. [Fri Jan 25 11:32:57 2013] [error] [client ***IP***] File does not exist: /home/testserver/public_html/testing/flats-in-delhi-for-sale, referer: http://domain.in/testing/flats-in-delhi-for-sale/ I searched everywhere, but found no solution. My _.htaccess_ config is given below: Options +FollowSymLinks Options All -Indexes ErrorDocument 404 http://domain.in/testing/404.php RewriteEngine On #Category Link RewriteRule ^([a-zA-Z]+)-in-([a-zA-Z]+)-([a-zA-Z-]+)/?$ view-category.php?type=$1&dis=$2&cat=$3 [NC,L] #Single Property Link RewriteRule ^([a-zA-Z]+)-in-([a-zA-Z]+)-([a-zA-Z-]+)/([a-zA-Z0-9-]+)/?$ view-property.php?type=$1&district=$2&category=$3&title_alias=$4 [NC,L] I also found a similar old question, but no answer (\"File does not exist\" in apache error log). Thanks in advance for your help. P.S.: My site is working fine even though the Apache log is showing the error."} {"_id": "53905", "title": "Google Cache showing wrong URL", "text": "I searched the cache details of the URL `http://property.example.com/pune-properties`, but Google Cache is showing details for `property.example.com`. I don't know why it's showing like this. Not only for `http://property.example.com/pune-properties`, but also for all the Indian city related URLs like `http://property.example.com/chennai-properties`, `http://property.example.com/mumbai-properties`, `http://property.example.com/kolkata-properties` etc. I don't even find these URLs in the Google search results. If I search for Chennai properties in Google, I find `property.example.com` and not `http://property.example.com/chennai-properties`.
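The RewriteRule patterns in the .htaccess question above are ordinary regular expressions, so they can be sanity-checked offline before blaming Apache. A sketch using Python's re module; it only verifies what the patterns match and which query-string parts they would capture, not Apache's per-directory prefix handling:

```python
# Offline check of the two rewrite patterns from the .htaccess above.
import re

category = re.compile(r"^([a-zA-Z]+)-in-([a-zA-Z]+)-([a-zA-Z-]+)/?$")
prop = re.compile(r"^([a-zA-Z]+)-in-([a-zA-Z]+)-([a-zA-Z-]+)/([a-zA-Z0-9-]+)/?$")

# The URL from the logged error, as seen by a per-directory rewrite:
m = category.match("flats-in-delhi-for-sale/")
# Groups map to view-category.php?type=$1&dis=$2&cat=$3
print(m.groups())  # ('flats', 'delhi', 'for-sale')

# The same URL does not match the single-property rule (no second path segment):
assert prop.match("flats-in-delhi-for-sale/") is None
```

Since the sample URL matches the category rule cleanly, the patterns themselves look fine; the logged raw filesystem path suggests Apache is also evaluating the URL as a plain file on another pass, which is a separate thing to investigate (the site working despite the log entry is consistent with that).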
Why is it happening like this?"} {"_id": "6793", "title": "Make content area appear on the right of the menu_bar", "text": "This is my CSS code. I would like to make the content_area come to the right of the menu_bar; at the moment it comes at the bottom of the menu_bar. body{ color:#000000; font-family: arial, sans-serif; } #container { margin-left: 10%; margin-right: 10%; border: 1px solid #46A5E0; width: 80%; } #header { margin-left: 0.5%; margin-right: 0.5%; margin-top: 0.5%; margin-bottom: 1%; border: 1px solid #46A5E0; width: 99%; height: 10%; } #pathway{ margin-left: 0.5%; margin-right: 0.5%; margin-top: 0.5%; margin-bottom: 1%; border: 1px solid #46A5E0; width: 99%; height: 3.5%; } #menu_bar{ margin-left: 0.5%; margin-right: 0.5%; margin-top: 0.5%; margin-bottom: 1%; border: 1px solid #46A5E0; width: 20%; height: 40%; } #content_area{ margin-left: 25%; margin-right: 0.5%; margin-top: 0.5%; margin-bottom: 1%; border: 1px solid #46A5E0; width: 20%; height: 40%; } #footer { margin-left: 0.5%; margin-right: 0.5%; margin-top: 0.5%; margin-bottom: 1%; border: 1px solid #46A5E0; width: 99%; height: 3.5%; } This is the HTML: This is my first page

<div id=\"container\"> <div id=\"header\">This is my header</div> <div id=\"pathway\">My pathway goes here</div> <div id=\"menu_bar\">...</div> <div id=\"content_area\">...</div> <div id=\"footer\">this is my footer</div> </div>
This makes the content_area go below the menu_bar; how do I bring it to the right of the menu_bar?"} {"_id": "27372", "title": "Web hosting company basically forces me to use their domain name", "text": "I've recently stumbled upon an unusual problem with one of the hosting companies, called giga-international.com. Anyway, I've ordered a com.hr domain from a Croatian domain name registration company, and my client insisted on using this host provider as a couple of his friends are already hosted with them. I thought something was fishy when the first result on Google for Giga International was this little forum rant instead of their webpage. When I was checking their services they listed many features etc... space available, bandwidth etc. I just wanted to check how much RAM I get for my PHP scripts, so I emailed them, and they told me that was a company secret. Seriously? Anyway, since my client still insisted on hosting with them, I've bought their Webspace package. During registration I **had** to choose a free domain name because I couldn't advance the registration without it. Nowhere was it said, not even in the general terms and conditions, that I wouldn't be able to change that domain name. At least not for double the price of the domain name per year. They said I can either move my domain name over to them (and pay them domain registration), or pay them 1 Euro per month for managing a DNS entry. On any previous hosting solution I was able to manage my domain names just by pointing my domain to their name servers, and this is something completely new and absurd for me. They also said that the usual approach is not possible because of security and hardware limitations. I'd like to know what you guys think about this case, and whether I should report it, and where I should report it. **In short.
They forced me to register a free domain name which doesn't suit my needs in order to register for their webspace package, and refuse to change the domain name for my account until I either transfer the domain to them or pay them for DNS management, which costs double the price of the domain name per year.**"} {"_id": "27373", "title": "Forum Spam Question. Why are Chinese hackers posting tiny images on my forum", "text": "My forum is getting attacked by spammers with Chinese IPs. Replies are created to discussion topics, always short responses, and along with the response an image is inserted using forum code, which is then modified by the forum to an HTML image tag (no anchor tag). > So beautiful picture [img]http://www.coupon-domain-goes-here.com/avatar2.jpg[/img] The replies are created by humans, not bots, because they are slightly relevant to the discussion. The images do not display, so regular forum members have no idea that this is spam. The reason they don't display is that the images return a 302 response code and redirect to a URL like this > http:// www.coupon-domain-goes-here .com/avatar.php?u=2 which then redirects with a 302 to another image on the same domain, which is also called avatar.gif. > http:// www.coupon-domain-goes-here .com/images/avatar.gif My question is: why are they doing this? Is there an SEO benefit? There is no link created. It's only an image, so the URL in the middle, which is a PHP file, should not be getting any link juice from search engines. Or maybe I am wrong? What do you think?"} {"_id": "61245", "title": "Will using two different tracking codes affect my SERP", "text": "Hello everyone and thanks for your time! I am now facing a problem after a site migration. The new site is basically an improved version of the old site, with the same content and some extras. After pointing the domain name to the new site, the old site was still online for a while but didn't get any traffic. The new site has its own tracking code.
So, the old tracking code has age (something like 7 years) but no visitors for a month, while the new tracking code is a month old with acceptable traffic. How do you think Google will react if I add the old tracking code to the new site? Thanks in advance!"} {"_id": "25745", "title": "Which Content Management System (CMS)/Wiki should I use?", "text": "_This is a general, community wiki catch-all question to address non-specific \"I need a CMS or Wiki that does x, y, and z...\" questions._ _If your question was closed as a duplicate of this question and you feel that the information provided here does not provide a sufficient answer, please open a discussion on Pro Webmasters Meta._ * * * I have a list of features that I want for my website's Content Management System (CMS) - where can I find a [free] script that includes all of them?"} {"_id": "16930", "title": "Blogging platform that supports guest-posting", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for a blogging engine (hosted or downloadable) that will allow: * guest posting (anyone can register an account and post) * file attachments (you can attach files to posts)"} {"_id": "8438", "title": "Which Bliki (Blog+Wiki) solution can you recommend?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm searching for a good Bliki solution, meaning a combination of blog and wiki that I can install on my own web space. I would like to be able to write articles in wiki style, much like with MediaWiki. So I want to use a wiki markup language, have a revision history, comments, internal links to other pages (maybe in other languages), and be able to collaboratively edit the articles. On the other hand, I would like to have a blog-like view of my articles, showing new articles (and changes to existing articles) in a time-ordered fashion. 
It would be nice if it were possible to search through the articles and also tag them, so one could generate a tag cloud for the articles. A nice feature would also be the ability to order the articles by views, or even a voting system for the articles. A permission system to keep certain articles private, showing them only to people logged in to the platform, would also be good. Apart from these nice-to-have features, an absolute must-have feature for the Bliki platform I'm searching for is the ability to handle math equations (written in LaTeX syntax) and display them either as pictures, like MediaWiki, or even better using MathJax. At the moment I'm using a web service called Wikidot which offers some of the mentioned features; however, the free version shows too many advertisements, the blog feature is not mature, the design is quite ugly, and loading of the page is often slow. So I want to install a Bliki solution on my own webspace. Can you recommend any solution for that?"} {"_id": "21144", "title": "Are there other .NET based Content Management Systems (CMS) beside DotNetNuke that support multiple sites with one install?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I know that there are a LOT of questions on SO about CMSs based on .NET, but I have just one specific question. I know that DotNetNuke supports multiple site creation on one installation. Of the other well-known .NET CMSs (N2CMS, Composite C1, Umbraco, Orchard, AxCMS), do any of them support this feature out of the box (or with relative ease)? BTW, if you know of some low-cost non-free CMSs that support this feature, don't hesitate to mention them (as long as they are built on the Microsoft stack.) **EDIT** Just learned that this feature is called multi-tenancy... thanks David. Thanks for your answers. Can anyone clarify whether N2CMS, Composite, or AxCMS support multi-tenancy? 
Seth"} {"_id": "8412", "title": "Which CMS for photo-blog website?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I need to add a photo blog to a site that I'm currently working on. It is a very simple site, so the blog doesn't have to be very sophisticated. What I need is: * a CMS that allows me to create simple blog-like news posts with one (or more) images at the beginning and some description/comment below. Preferably, I would like to create something that works like these two sites: http://www.photoblog.com/dreamie or http://www.photoblog.pl/mending/ * it must be customizable. I want to integrate its look as much as possible with the current page: http://saviorforest.tk * preferably, it should provide some mechanism for uploading and storing images on the server. I thought about WordPress, but it seems a little bit too complicated for such a simple task. Do you know any simple and easy-to-use CMS that would work here?"} {"_id": "61090", "title": "Best CMS (Content Management System) for Technical & Creative Guidelines", "text": "Wondering if anyone here has any idea of a good content management system to act as a Technical & Creative Guidelines documentation site, preferably one with easy customization and strong indexing and search. It is not web-facing, meaning it will be running within a Virtual Private Network (VPN) as well as an intranet. Any suggestions or ideas would be more than helpful!"} {"_id": "12389", "title": "Website Review CMS (similar in nature to Forexpeacearmy.com)", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? I was wondering if anyone knows of any CMSs that are similar to Forexpeacearmy.com, i.e. different categories with lists of websites in them, and the ability for users to review websites. Thanks"} {"_id": "7549", "title": "What platform do I need? 
Wiki, Tiki, or something else", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? I need to set up a collaborative website, like a wiki but where the items have a predefined structure. For example, a collaborative recipe website where a submission would consist of a photo, ingredients, instructions, cooking time, etc. Are there any open source platforms out there that allow collaborative editing with structured entities? Many thanks!"} {"_id": "3398", "title": "Best CMS for review-type sites", "text": "Is there an ideal CMS for making a review site? By review site, I mean something like a restaurant review site where each entry belongs to different major categories like Cuisine and City. Then users can browse and filter by each or by combination (Chinese food in Los Angeles, with suggestions of other Chinese restaurants in LA, etc.). Furthermore, I'd want it to support other fields like price, parking, kid-friendliness, etc., and to have users be able to filter by those criteria. I've been told that with a combination of custom taxonomies, plugins and many clever little queries, WordPress 3.x can handle this. But I'm having a heck of a time with it once I get into the nitty gritty, and that's where I find the community support is lacking. The sort of stuff you'd think would work in WP, like making one parent category for Cuisine and one for City, doesn't really work once you get further in and start trying to pull it all together. Then you find these blog posts where people say, \"This example shows that one could create a huge movie review site using custom taxonomies...\" but when you go and try it you hit all sorts of challenges and oddities that point a big long finger at WordPress being, in fact, a blogging platform. The best I came up with was one category for the cuisine and one tag for the city, then I created a couple of custom tag-like taxonomies for the other features. 
It's quite a mess trying to figure out how to assemble all of that into a natural, intuitive site. I expect that a few versions down the road WP will be able to do these sorts of sites out of the box. So I thought I'd take a step back before I run back into the WordPress fray and find out if maybe there is another platform better suited to this sort of relational content site. Directory scripts in some ways offer many of the features I'm looking for, but I need something more flexible and, hopefully, interactive (comments, reviews). I'm especially looking for feedback from people who've crafted sites like this. Thanks!"} {"_id": "16949", "title": "PHP / Database Advertising Directory", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? What would it cost, or is there already a system with a CMS like Joomla, to create the back-end of a website like this one: http://www.appliance-appointment.com/"} {"_id": "42802", "title": "Looking for a \"database/dashboard\" CMS. Any tips?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? I'm looking for a CMS that can handle: * User types at 3 authorization levels (client, partner, admin) * a dashboard for each of them with relevant info about the other 2 user levels Concretely, I am building a website where gardeners can set up a work schedule for their clients, where the client can see when the next meeting is and its cost, and the gardener can see his next meetings, monthly income, etc. The admin obviously has an overview. I don't believe any CMS exists that covers such a use case, but might there be any that can help me get started quickly? Thanks for your tips, and don't hesitate to ask if I haven't been clear!"} {"_id": "16837", "title": "What are some alternatives to ASI iMIS Content Management Systems?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? 
I am working with a team to select a new content management system for a large membership organization (around 25,000 members). The organization has revenue, so I'm not looking for a dirt-cheap solution. The site currently uses ASI iMIS, which is based on ColdFusion. It's difficult to work with and not flexible enough for our needs. What other possible alternatives to ASI iMIS are there? Ideally the solution would have some sort of support from the vendor. So far I've come up with: * Drupal/Acquia * SDL Tridion * Plone * Ellington (probably too news-like) * Pinax (probably not developed enough)"} {"_id": "67783", "title": "Online Marketing Website", "text": "Which CMS would be best for building my new website? I am currently using Wix; would this be suitable? There will be loads of blog posts, constant picture uploads, and press releases added on a weekly basis."} {"_id": "12929", "title": "Web-based CMS for mobile app", "text": "I'm just about to start developing a mobile app which needs to be fed from a CMS. I started designing the tables when I thought there must be something out there which could save me a load of time and let me concentrate on the mobile side of things. So, I'm looking for a CMS that will let me create hierarchical \"pages\" which will just be 4-5 database fields with a simple front-end to allow me to edit and update them. I don't mind having to write some code to lay out the database and forms, etc.; any saving over starting from scratch would be good. The only requirement is that I be able to access the data via some sort of web service: REST, JSON, XML, anything really... Can anyone suggest anything that might help? Thanks, J"} {"_id": "15553", "title": "Do you know a good web CMS to manage a sports team?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? 
I'm looking for a web-based CMS that enables me to manage a sports team. I need the following features: * Calendar** * Schedule events (synced with the calendar, RSS feed); it would be great if I could schedule a weekly event too, so that I don't have to schedule it by hand each week** * Announcements (same RSS feed as events)** * A place where I can put some documentation and rules** * Keep track of the matches and scores * Photo and video gallery ** means the feature is required; otherwise it is optional Any technology for the CMS is probably fine, though I would prefer an SQLite-based CMS."} {"_id": "28499", "title": "PHP photo/album gallery script with an admin backend and easy to integrate", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for open source photo and album gallery software that is easy to integrate into a webpage and has an easy-to-use admin panel for adding/deleting/modifying photos/albums, plus a thumbnail resizer. I've found myGallery, which basically does what I want, but it doesn't cache photos and is **really** slow."} {"_id": "26150", "title": "Which eCommerce has a good module with product configurator?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for eCommerce software which also has a good \"product configurator\". My company sells highly configurable products. Which eCommerce system has a good configurator? Thanks"} {"_id": "24115", "title": "Looking for quick and simple blogging solution", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for a really quick \"plug in and play\" style blogging software for PHP. It's to be used on a website that has already been designed and deployed, and it's really going to be used to add that extra bit of functionality. The things I really need it to do are: * Manage blog posts (including some form of editor). * Display blog posts on arbitrary webpages. 
* Unobtrusive to implement. * Nothing more. This basically rules out WordPress/Drupal; they are far too overburdened, and I don't want to have to port the website to a WordPress theme. The blog is supposed to fit around the website, not the other way round. Yes, Django would be perfect for this job; however, the environment in which the website is hosted does not allow me to use it. If anyone knows of anything similar for PHP, that would be helpful. If not, I will just write something basic but functional, but an existing solution would at least save me a little time and effort."} {"_id": "38849", "title": "Which eCommerce cms provide freedom of using my own eCommerce HTML template?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? Which eCommerce CMS gives me the freedom to use my own eCommerce HTML template? We work on ASP.NET; however, we are okay with other languages as long as zero coding skill is required."} {"_id": "26632", "title": "Could any current, well supported CMS support a live action role playing game?", "text": "A group of friends and I are looking to create a system which will support and allow interactions between a group of up to 70 people using mobile devices (phones and tablets) as part of a science fiction themed freeform game. In order to minimize development effort and support as wide a range of devices as possible, it's become clear that a web server based infrastructure is the only practical solution. The server would be connected to a wireless network, and the players would use their web browsers to access the server wirelessly. I think the only thing that will do the job is some sort of CMS, but I have no detailed experience with any CMS, and as such could do with some advice as to suitable platforms. **Required functionality** is: 1. Some sort of wiki containing game information. 2. 
The ability for both players and referees to identify themselves to the system by logging in. 3. Access control at a per-user and per-page granularity (not all users will have access to the same pieces of information) 4. (Text-based) messaging (essentially email) from within the web interface 5. A notebook page for each player to make their own notes. **Required user interface features** are: 1. A UI optimised for mobile and touch devices 2. The ability to increase the text size of the interface on each device 3. Not too complex to use - some of our end users will not be technically minded The **required infrastructure** is: 1. Extensibility without a huge learning curve - we have several people with a programming and/or systems administration background, but this is a hobby project for all of us, and none of us are CMS or web app gurus. In particular, we are likely to need to develop a script to change users' access permissions if they scan a QR code at some point in the game (giving them access to new information). 2. Longevity - this system is likely to be in use for at least 5 years (in short bursts, but not being able to maintain or set up a new server in 3 years' time would be a real problem). Things which are **desirable**, but not essential: 1. Some sort of offline operation (I don't think we're going to get this one, but it doesn't hurt to ask) 2. Built-in voice/video conferencing/messaging 3. The ability to tie a log-in to a device so the user only has to log in once (mobile web browsers often being killed by the OS task manager the moment the user switches to something else) 4. Something that looks good on a tablet as well as a phone screen What sort of solutions would people suggest if they needed to develop a system with these requirements?"} {"_id": "1540", "title": "I need an alternative to Pligg.com", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? 
I was working with Pligg - The Social Networking CMS - and it seems like great software. But there are a few things that are not working so well for me. The content scraper module is not perfect and has lots of bugs. Does anyone know of any alternatives? All I am trying to do is create a Digg-like website. Thanks"} {"_id": "22362", "title": "Multi language CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I need a very simple multilingual CMS. Currently the site is in pure HTML, so all pages are written in pure HTML. No dynamics, blocks, etc. What I need is the ability to create 3 versions of the site and add content to each. That's all. Currently I'm thinking of WordPress+qTranslate or TribiqCMS, but I am not sure right now."} {"_id": "43941", "title": "Help with a CMS for content only not display", "text": "Hello, I'm trying to make some kind of tool for a school website. What I need to do is have students take a test, and according to the results (27 possibilities) they get a set of activities (questions) matching their level, which they can work through over around 3 months by logging in to the website periodically; in addition, I need teachers to log in and look at the reports. Now, I'm a graphic designer myself, so my skills are mostly HTML5 and CSS3, and I know some PHP (editing existing scripts only) and JavaScript (jQuery) as well. Most people tell me that I need a CMS to build the tool, but all I find are CMSs for display, like blogs or news websites, which I think aren't useful for me because the website is already made in HTML and CSS3 only (I need to add an extra page for the tool). I understand I need to create users and give them special rights according to what type of user they are, and I also understand that I need a database where I can store all my questions. What is the best way to do this? What do you suggest? 
Thanks"} {"_id": "53026", "title": "CMS for sharing media files?", "text": "I am looking for a CMS to host my uploaded files, share the download links, and view a report of how many times a file was downloaded. Is there any CMS that I could install on my host? WordPress, Drupal and Joomla just don't work for what I want; I am looking for something similar to Megaupload or MediaFire (which have a back-office view, front-end, etc.). Thanks!"} {"_id": "54836", "title": "booking and reservation content management system", "text": "I'm looking for a CMS with which I can make bookings and reservations for an event or restaurant, with a payment system, and open source if possible. Thank you all"} {"_id": "21371", "title": "Custom Upload Advanced Scripting CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking for a specific content management platform that would display themes for my application. Requirements are as follows: * Any user can upload content, but it has to be approved by an administrator * When the user uploads the content, an external application is called to generate a thumbnail I could create this using CodeIgniter or something, but I would much prefer to use an existing system. I have experience with Drupal (seems a little bloated for my needs) and WordPress (I'm using it for my main website right now). Maybe I need a plugin for WordPress instead of another CMS. WordPress currently blocks uploads of my file type. I can modify it, but it's a pain to update it every time WordPress has a new release."} {"_id": "39731", "title": "Which CMS or framework for a website with classified ads?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? A non-technical friend has asked me to provide some guidance for his new website idea. To avoid losing a friend, I intend to send him to another professional to set up and maintain his website. 
The idea is not revolutionary - a basic website with user-submitted ads - but given it is for a niche, he might get some traction, and I wish him good luck with that. Given that the functionality is really very standard, should I suggest that he go with something fairly mainstream such as Joomla or WordPress that can be easily hosted on shared hosting? The only data that we want secured is the emails and phone numbers of the ad submitters. Any particular steps to take there? The other major requirement is that the site works well on mobile devices. Which particular product would you suggest, and why? I also welcome any general suggestions on products or approaches that he can take."} {"_id": "18141", "title": "Best content management system for a web hosting provider website?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? Which is the best CMS for a hosting provider website? 1. WordPress (is this not fit for such websites?) 2. Joomla (low security) 3. PHP-coded pages 4. Drupal"} {"_id": "16265", "title": "Best CMS for a membership site?", "text": "I have been told Joomla, but I am partial to WordPress. Can anyone let me know the best membership CMS, its plus points, and any big sites running it? This is for a non-profit, members-only site mainly dealing with tutorials, stock galleries, templates, blog posts and support forums."} {"_id": "30444", "title": "I want something ready to start with", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking for something quick, like a weblog on WordPress or Blogspot maybe, where when I write a blog post I can put it there. For example, if I write something about .NET, Java or databases - some quick tutorial with small code samples that visitors can use. I don't know anything about web design and I just want a ready-made thing to use for this purpose. What do you suggest? 
Any samples that I can take a look at?"} {"_id": "36409", "title": "PHP CMS without database", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? A friend of mine asked me to make an easy website for him. As it was just 3 subpages, no database needed or anything, I did it quite quickly with plain PHP, HTML, JS and CSS. But then another friend, and another, showed up. They only wanted to arrange their navigation differently, use some different pictures, etc. So I thought: is there any kind of CMS that allows building a small business website (barbershop, local grocery shop) with a small list of subpages, yet allows us to arrange the look and feel? There's no need for a database, as the content won't change, or it can be stored in simple text files, for example. Things like WordPress might be simple overkill. Does anyone know such a CMS?"} {"_id": "23494", "title": "CMS for code snippets?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I need the simplest way to publish code snippets in PHP, JavaScript, HTML and CSS. What is the best free CMS to use with PHP/MySQL, given these requirements? * Made for code snippets, or very good at them. * Syntax highlighting for the code snippets when published, and even when writing, if possible. * Possible to write some documentation around the code snippet. * Nice URLs, like /html/input-search/ * Tags or categories for code snippets. I'm using WordPress for other things, but I don't think it's simple enough for code snippets."} {"_id": "13522", "title": "Script for a community blog", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I want to start a community blog: a blog where anyone can post. Basically, I need a CMS that has the following features: * People can post from the front end: they don't have to go to an administrative interface. * Anyone can post * The popular posts (the ones with the most views, links, comments, etc.) 
are moved up to the home page, while the less popular posts are not as visible. * Categories and Tags * Comments * Some sort of internal trackback system: if user1 creates a post with a link to user2's post, then user1's post should show up in the comments of user2's post. * A simple profile (name, avatar, bio, etc.) * The ability to rate posts and comments. * Some sort of anti-spam system. Technology: PHP. Any suggestions?"} {"_id": "1622", "title": "Pligg like CMS to create digg like sites", "text": "> **Possible Duplicate:** > I need an alternative to Pligg.com I am trying to find a Pligg-like CMS that I can use to create Digg-like sites. With Pligg, I am not able to import the links automatically. They even have a plugin for this; I read about that plugin and it looks like it can fetch only from selected sites, and I'm not sure how good it is. I checked out Drigg for Drupal, but I don't understand the installation steps properly and I am not able to find documentation on doing this. Can anyone please suggest a CMS like Pligg that can pull bookmarks from other places?"} {"_id": "56443", "title": "New to CMS - Media Uploads & Profiles?", "text": "I am a designer/developer. My experience is in `HTML/CSS/JS`, with some knowledge of `PHP`, `MySQL`, and WordPress one-click installs. It'd be helpful if you could share some experience in choosing and setting up a CMS. Ideally, I have two places/needs for content management: 1. A media gallery in `HTML/CSS/JS` that can be populated with images from a CMS image upload. 2. A series of 'profile cards' with fields I can customize in the CMS: a name, a title, a bio, an image, etc., and then be able to display that content in customized `HTML/CSS`. These two features, along with a well-designed, intuitive admin panel, are the requirements. I'm still very new to backend installation and hooking into a CMS. Drupal is my first choice since it has a large community and I like that it has extendable modules. 
So, can this be accomplished with Drupal, or is there an alternative I can look into?"} {"_id": "10187", "title": "CMS for a site with blog, login, shopping cart, etc", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am in the planning process of a website that will have a blog (as well as other pages which will be static), a login area (so that registered users can access private content), a shopping cart, and possibly later on a forum. What would be the best CMS to use for all of this?"} {"_id": "53760", "title": "Some sites with wiki-engine in the web for hosting", "text": "Are there any sites on the web, like Wikipedia and this site, which have a wiki engine and may be used for hosting? Wikipedia has some trouble nowadays with converting formula tags, so I want to use another site."} {"_id": "50410", "title": "Is there any Open Source alternative to e-paper CMS?", "text": "I'm looking for an open source CMS such as: * Express News * Dawn E-Paper A similar question has already been asked, but it is still without a proper answer."} {"_id": "25652", "title": "Need a CMS for a church - sermons, galleries, front-end/simple editing", "text": "I'm creating a site for a church (news, events, sermons (audio & video), group pages, etc.) For many clients, I use Drupal, Joomla, or WordPress; however, none of these will work in this scenario. I need the church administrators to be able to create layouts easily, etc., and not be bogged down by the number of features. Often for these types of clients, I use Joomla, but editing content in multiple areas can be confusing for clients (is it content, a component, or a module, etc.). Drupal is great for form-based content submission, but the rest would be overkill for this client. 
I was leaning towards concrete5 for its simplicity of use for the end user, but many of the concrete5 add-ons feel like the alphas/betas of other CMSs' extensions/add-ons. I was also tempted by Fork CMS, but it's fairly new and doesn't offer a lot in the way of customization. So what would be a good CMS that allows front-end editing or has an intuitive backend, offers good solid extensions/add-ons, and is mature and sees regular updates?"} {"_id": "23577", "title": "Light CMS that does not require a database", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking for a light CMS that I can use to create a simple personal website for a friend. The server that will host his website supports only PHP and no databases. So, I need a CMS that need not be very scalable; the important thing is that it must be easy to use, with a simple user interface that a layman can use to add content (and, as I said, no databases)."} {"_id": "42228", "title": "Choosing a CMS for Non-Technical Users", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? I am currently looking for a free CMS geared towards non-technical users. I'm hoping to find something where they can easily create tables, move blocks around with a WYSIWYG editor, etc. Does such a thing exist? Myself, I can put up with any CMS I come across. I can use HTML/CSS/PHP/MySQL/etc., so I can install whatever I need to and make edits wherever I need to for the web design/template/theme. I know the popular ones: WordPress, Drupal, and Joomla. They're apparently not simple enough for what I need."} {"_id": "47794", "title": "We require a blog plugin to our asp.net website?", "text": "We require a blog plugin for our existing ASP.NET website. I've used WordPress before; however, as our site is ASP/IIS, WordPress is out (I understand there may be workarounds, but I would prefer something that works out of the box). 
We need blog functionality to add into our existing website template (header, columns, etc.), so we only require a subset of full blog software functionality: most blog software out there is aimed at creating a whole blog website, whereas we only want it to be a page on our website, in line with the current formatting of the site. What software would you suggest I use?"} {"_id": "30142", "title": "Blogging platform that supports group, private, public blogs and advanced tagging", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? After some bad questions here, I finally have my main question in mind :) I need to design a platform that provides: -Personal blogging (blogs that are viewable only by their owners) -Group blogging (blogs where only people who are part of a small group (3-4 people) can see the posts and comment) -Public blogs Also, about the content itself, I would need: -Advanced tagging (common tags amongst all blogs AND tags specific to a blog in addition to the common ones) -Search capability -Rich text content with photos, media, etc. in the blog entry -A database that I could access to create custom applications (mostly read-only) on the data. An API would also be helpful. An example of a custom application would be to select your blog posts, create an offline file (.zip, HTML, etc.) and have it stored locally on your machine. After some research: -Forum platforms are NOT suitable for this, as I would have to create hundreds of sub-forums with selective security for each of them. It would make administration almost, if not entirely, impossible. -Joomla and Joomla-like CMSs also give me bad signs, as I couldn't have specialized tags per category of articles and I suspect heavy coding would be needed to do it. Also, I don't know if it could satisfy my other above-mentioned needs. -I have found a combo that may work: Invision Board and IP.Blog. 
IP.Blog supports almost all of the features that I want, but it seems maybe too good to be true (any input from someone who already uses it would be helpful). Many solutions follow the logic that a blog is public and allow for some level of security on it, which is not suitable for what I want."} {"_id": "24058", "title": "Turnkey software for a community website", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for some software to run a small community website. Ideally, what I want is: * A wiki * Static pages of some kind, perhaps CMS-controlled * Some sort of blogging or news functionality. I've been playing with Django, but it's difficult to get the results you want on cheap shared hosting (limited control of httpd.conf). Perhaps some sort of wiki package would provide most of this?"} {"_id": "33706", "title": "What is a good solution for a web based course management portal?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking to host a web based course portal at my college. The portal would be used for the following tasks: * Instructors will upload lecture notes and assignments, and embed videos (YouTube etc.) * Students will consume the content and, in addition, upload their assignment submissions. * Instructors and TAs will monitor students' progress and give feedback. * Students, instructors, and TAs will use a forum on the portal for discussions. Could you please recommend a good web based solution (preferably free) which satisfies the aforementioned requirements as fully as possible?"} {"_id": "19948", "title": "Which is the blog engine for my needs?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am hosting two websites, which basically contain two filtered views of one blog-like database.
Both sites have their own domains and the files are hosted on the same machine (currently in two directories, but this could be changed to one directory easily). Now, as a matter of fact, I am sick of maintaining my own PHP code just for such a blog, so I thought there must be a blog engine out there which I can use instead. So I googled a bit, and I simply don't know which engine to start trying out. Can you help me and tell me which engine is able to fulfill my requirements? I don't need a setup/configuration tutorial or anything; I will figure that out myself as soon as I know which software to look at. My requirements: * I want to continue to host the sites by myself. * I want to use different design templates on both blogs. * I want to feed both blogs from one database and filter the entries, e.g. through tags, so I can have entries showing on only one site or on both sites. * I want to write entries in two languages, so the user can see the whole site in the language he/she selects. However, I want these still to be single blog entries, i.e. if a user comments on an entry, I want that comment to show in all versions (both languages, and both sites if the entry is visible on both sites). * I want RSS feeds (or similar) of the pages (filtered, similar to the webpages themselves). Does anyone know a simple/small CMS which can do that for me without being too huge?"} {"_id": "16046", "title": "Intranet Content Management System", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking to find a free CMS that would be suitable for an intranet environment with around 50 users. What suggestions do you guys have? All that is required, really, is a few pages listing links to different documents on the network and possibly an event calendar."} {"_id": "31658", "title": "Any CMS or Framework that supports MongoDB as the only database?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use?
I'd like to exercise MongoDB in a real-world context, so I'm wondering which CMSs, or better, frameworks support MongoDB as the main database out of the box."} {"_id": "24420", "title": "User-friendly CMS for a large organisation", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking for a CMS for a large organization. There are many sites managed by many webmasters. The sizes of the sites range from pretty small to pretty large (60,000+ pages). The choice of technology does not matter, as long as the product meets the following requirements: * Allows complex workflows to accommodate the multiple needs of a lot of clients (many levels of revision, translation, approval, etc.) * Very easy for the users (very easy for developers too would be great, but I could live without it) * Has content history (it is possible to view what the content of a page was at a specific date) * Must be multilingual (at least French and English) * Has a flexible security model * Has good performance * The layout of the pages must be highly configurable (not sure if I should write this, as all CMSs probably meet this criterion) * It must be possible to publish content that respects the WCAG 2.0 AA standard. Can you point me in the direction of a CMS that meets all those criteria?"} {"_id": "45167", "title": "What's the best CMS for my project?", "text": "I'm new to content management systems and I have very little experience in web development. Which CMS do you recommend for the following tasks? * Users can log in and create list items * Users can upload files to list items * Users can comment on and rate uploads (thumbs up/down) * Users are able to change the font size of the page via mouse click * Users are able to export pages to PDF via mouse click * Breadcrumbs * Small and easy to use forum * Users can change the language of the site. So far I have experimented with WordPress, but I think it's not the optimal choice!
I have to use MySQL as the database."} {"_id": "18645", "title": "What is the best open source e-commerce application for services?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? Normally, e-commerce applications are designed for selling products, but I want to set up a website with an open source e-commerce application to sell services instead. Which is the most suitable open source e-commerce application for selling services instead of products?"} {"_id": "68867", "title": "Which CMS supports private-key authenticated SFTP uploads?", "text": "I use EC2 to host a website, and would like to implement a CMS which can log on with a private key file rather than a username/password, in order to keep security on the server high so no one can brute-force their way in. When I Google Drupal & Joomla private key authentication, I get nothing. Is anyone aware of a web dev app that supports this? Preferably open source. Thanks."} {"_id": "8769", "title": "Choosing open source vs. proprietary CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I've been tasked with redesigning a website for a small academic library. While I've only been in charge of the site for 6 months, we've been maintaining static HTML pages edited in Dreamweaver for years. The last count of our total pages is around 400. Our university is going with an enterprise-level solution called Sitefinity, although we maintain our own domain and are responsible for maintaining our own presence. Some background: my library has a couple of Microsoft IIS servers on which this static HTML site has been running. I'm advocating for the implementation of a CMS while doing this redesign. The problem is I'm basically the lone webmaster, so I have no one to agree or disagree with my choice. There are also only 1-2 content editors right now for the site, but a CMS could change that factor.
I would like to use the functionality of having servers that run .NET and MS SQL, but I am more experienced in setting up and maintaining open source software like WordPress or Drupal on web hosts. My main concern is choosing a CMS that will be easy to update / maintain / deal with upgrades (i.e., support) in case I'm not there in the future. So I'm wondering how to weigh the open source CMS vs. relatively inexpensive commercial CMS decision, and whether choosing a PHP/MySQL vs. ASP.NET framework development environment will play into my decision. Thanks for any input that can be offered based on the details I've given. Thanks, Jason"} {"_id": "13456", "title": "What is the best CMS for .NET?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I need advice on the best CMS for an online shopping website in ASP.NET 4.0. Any advice?"} {"_id": "23173", "title": "Looking for a CMS including a wiki, with the possibility to comment on individual paragraphs of the wiki and make these comments visible on the same page", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking for a specific CMS including a wiki, with the possibility to comment on individual paragraphs of the wiki and make these comments visible on the same page. The CMS should have a user system that allows users to be rated on the quality of their comments. Ideally, the system should allow for different user roles/permissions based on their experience. I looked at the site rapgenius.com and would like to use it as a model for a similar project in which people can contribute texts, work on developing these texts, and comment on text passages. Sorry, I am not a software engineer or webmaster, but I thought this might be a good forum for my query."} {"_id": "61642", "title": "ASP.NET CMS for link or content sharing with vote system", "text": "I'm looking for an ASP.NET CMS which gets links from users and shares them based on tags and such.
I also want users to be able to rate and vote on each shared content item or link. I know what I'm looking for is a special web application, but I'm going to build it and I don't want to start from scratch. Do you know an open source ASP.NET CMS for such a thing?"} {"_id": "11442", "title": "PHP image gallery that integrates well into a custom CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I've been trying to find an image gallery that plays nice with our custom CMS. I've evaluated a number of them, but none of them seems to have the feature list that I would like: * Runs in a LAMP environment * Free software or low license costs (the website belongs to a non-profit organisation) * Multi-user support * Multiple albums. We're posting concert pictures, and would like an album per event. * Pluggable authentication system. I want to reuse the accounts we have for our CMS. Permissions can be done inside the gallery itself, but I want to have a single sign-on solution in a maintainable manner, by writing my own plugin/add-on for the software. * Upload support (multiple images at the same time). And preferably also: * Can be integrated into a PHP page layout without IFRAMEs * Automatic resizing of uploaded images to a maximum size * Ability for visitors to place comments. This combination is proving hard to find, especially the authentication requirement. I don't want to mess around all over the place in the source code to make it use the existing authentication. A plugin would be ideal, but alternatively a well thought out software design that allows for maintainable surgical changes would be acceptable. Any suggestions on which software I should take a closer look into?"} {"_id": "9584", "title": "Which CMS can I use for my project?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I have to build a new website for my client, and he needs this stuff: a user-generated content website,
where users can upload their own videos (I will use a video hosting platform with an API, like fliqz.com). I must manage a big user system where users can create their own \"sub-users\". The site must manage paid subscriptions and payments through PayPal and other payment gateways. So the question is: should I use a pre-built CMS and extend it, or just a good framework? I've been thinking about Joomla, Drupal, and ExpressionEngine. It's not necessary to use an open source CMS. I've been looking at Drupal, but it is not as easy to understand and extend. Can anyone help me to make a good decision?"} {"_id": "7322", "title": "How to find a good photo gallery for my website?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? For my website I'm searching for a really simple gallery module that looks like the one used by Dropbox, but I'd like to have 2 additional features: allow visitors to make comments, and display the number of hits on a photo. I was googling a lot for such galleries, but couldn't find any that really matched my requirements. Could someone recommend a simple, good-looking gallery that fulfills these requirements?"} {"_id": "46793", "title": "Updating Old Site To New. Which Content Management System Lets Me Retain My Old URL?", "text": "My old site is done in HTML with .html as the file extension. Because many users visit sites by typing a specific page URL, and the URL structure is good, I want to retain every single old URL as far as possible, including index.html as the main entry page, even though the content management system might start with index.php or something like that. Could you tell me which content management system lets me pick my own URL for each page or entry?
Example: URL 1 given by the content management system: travel/whereonearth.php; my old URL: travel/whereonearthe.html. URL 2 given: travel/mars.php; my old URL 2: travel/mar.html. I will type in all the old URLs for each entry page; once the site is up, I will delete the old .html pages. This is for users and not for SEO or anything like that. I don't like .htaccess or anything like that, so please list the CMSs that allow me to do this."} {"_id": "11316", "title": "Fusion News or CuteNews", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am looking at using CuteNews or Fusion News; neither has the specific option I want, which is to email all my subscribers the latest article when a new article is posted. I would like to know which one is more customizable to do this, or of another application that can do this. I didn't want to use WordPress because I want to customize the layout and page, but if that's the only option then I may stick to WordPress."} {"_id": "24140", "title": "Lightweight CMS for a simple website", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for a very lightweight CMS that allows me to design my pages in the exact way I want. All those I found give you standard templates to work with that are quite limiting. For example, I should be able to generate a page with one full-screen image with a dynamic text area in its middle, or a tall image on the side with a few text areas scattered around. Nothing too fancy... I just need the freedom to place the objects wherever I want."} {"_id": "36349", "title": "News Portal CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? I am looking for a specific news portal CMS. I know all the major \"general\" CMSs (like WordPress, Drupal, or Joomla) and even the lesser-known ones (like TYPO3, ExpressionEngine, Textpattern, or Concrete5).
I'm already working with a Drupal distribution called OpenPublish and another WordPress installation to determine which would be better, but these are more of a plan B. I would like to work directly with a CMS that was built exactly for the kind of tasks specific to a news / media portal. It doesn't matter if the CMS is commercial (however, I don't want to pay a monthly fee) or free, but I need to be able to use it on my own server / hosting and I need to be able to access its source code (not to modify it, but to integrate it with future plugins / modules). If you know any CMS that qualifies for this job, please let me know. In the last few days I was all over Google, but I couldn't find anything worth mentioning."} {"_id": "48850", "title": "A Java based modifiable web shop platform?", "text": "I have a project which requires a web shop. It should be a Java-based web shop, since I know Java well. The functional requirements are: * modifiable layout for main/goods pages * modifiable processes like checking out, etc. Is there a solution which meets my requirements that I should prefer (e.g. something as popular as Magento)?"} {"_id": "56419", "title": "Which CMS would Google recommend?", "text": "Which CMS would Google recommend for web design? I tested various CMS systems with PageSpeed and I did not find a CMS that performed close to 100/100."} {"_id": "19627", "title": "Simple blogging software (WP replacement)", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm sorry if this is not the right place to ask this question, but I think it's certainly better than stackoverflow.com. I'm planning to redesign my website and am looking for a WordPress replacement for my blog.
My requirements are: * simple (I won't do a large, complicated website) and lucid * well coded (OOP), extensions, templates, active development (not 3 years old) * nice but temperate text editor (+ images, links), tags and categories, page breaks * easy integration with Flickr, Picasa, and social networks * technology: PHP and MySQL, not a cloud service (like Posterous or Tumblr). Could you recommend some blogging software which would meet these criteria?"} {"_id": "13791", "title": "Picture Gallery & Message Forum", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? Any suggestions for both an open source picture gallery and also for a relatively simple message forum (only a low number of users)? PHP/Python/Ruby/MySQL or anything else which is available. What I am primarily looking for are applications which can easily be incorporated into our existing website (same header, menu, floating footer, quick links panel, etc.) rather than as a full separate webpage. Thanks,"} {"_id": "5540", "title": "Alternative to WordPress for a site with reviews", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? Can anyone help with an alternative to WordPress for a site with reviews? I need a more serious CMS. Thanks."} {"_id": "22282", "title": "Looking for a specific CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for an application (a web application, like WordPress) that you put your code into (CSS, HTML, etc.) and do something like \"define variables\" in the custom code that your clients can come back to later and change without having to see any code. Is there anything like this out there? Even close?"} {"_id": "46994", "title": "Private forum / guestbook to coordinate rides for a PHP password protected wedding website", "text": "I created a simple private PHP website for my wedding guests.
The website is a single PHP page that prompts the user for a password that was snail-mailed to them with our invitations. If the correct password is entered, the PHP page serves up the HTML content, which is otherwise inaccessible to the web. It works great! Now, though, I would like to add a forum, guestbook, or some other method for guests to post their travel plans or request rides from the airport and such. Any solution ideally would be: * **Private**, so that people feel comfortable posting cell phone numbers or travel information * **Simple** for less tech-savvy guests to use (e.g. no captchas, no email confirmations, no registration) * **Easy for me to set up**. I would like to avoid setting up MySQL on the server if possible. Things I looked into but gave up on: * **Using a Google Docs spreadsheet** is too difficult because it would require me to solicit email addresses from every guest and then manually invite them to the doc. Any solution should be accessible via a link from my password protected wedding website. * **NoNonsense Forum**: this PHP forum looked promising, but it would be very hard to make it private from the broader web. * **phpBB** and other more powerful forums: these are too complicated, and not obviously private either. * **Rideshare.us**: this service is close, but it's a little complicated and kludgy, and it really takes people away to a complicated website. I'm wondering if I can do better, especially for less savvy users."} {"_id": "29869", "title": "Whether to use Joomla or NOT for a site with the following specifications?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I want a **Joomla developers'** perspective on \" **whether it's advisable or not to use Joomla** \" if I wish to develop a site with the following features: 1. Mainly focused on e-learning (quite similar to http://grockit.com). 2.
Registered **students** can: watch/download the videos/slides present on the site; take a quiz (either solo or multi-player) on various predefined topics on the portal; when multi-player is chosen, chat with the others currently on the same quiz; get a complete explanation of all questions after attempting the quiz; and see their profile/status on the portal. 3. **Teachers** can: upload questions/slides/videos to the site (through a UI); review questions uploaded by other teachers; and join a chat room (quiz) and help students solve the quiz. 4. **Admin** (not necessarily a website developer) can: approve new teachers' requests; and manage the contents of the website (through a UI). The key points are: the site won't ultimately be handled by the developers, and the future admin may not know any programming at all! Is the compatibility issue across various versions of Joomla a big problem? Is it really easy for a person with no programming skills to handle the site afterwards? Is Joomla efficient enough and easy to learn for developers? Is there a sufficient number of Joomla extensions/plugins available for developing such a portal? And is it easy to edit/modify those plugins? What we are capable of developing: 1. All the basic features, including personal/group chat, video streaming, and the required coding for the development of the UI for students and teachers. 2. Also, we presently do not have any time constraints for the development of this portal."} {"_id": "33298", "title": "Which wikis have Markdown support?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? There exists a Markdown extension for MediaWiki, but it is very buggy. Does anyone know of wikis that support Markdown?"} {"_id": "24705", "title": "Recommend an open source CMS for a single page web site", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use?
Hi, I want to create a single page website like http://kiskolabs.com/ or http://www.carat.se to display my portfolio. I want to add new products after launching the site without having to edit the entire site. I've looked at OpenCart (too much for a single page site), Magento (more for e-commerce), and WordPress (couldn't find open source / free templates which I can start from). Can you suggest a CMS which will support the creation of a single page site and allow the insertion of new products without having to edit the entire page? I would prefer a CMS which also has open source / free templates which I can tweak for my use. I can do PHP, MySQL, and XML. If it is an easier option, I can do PSD to site (but I don't know much about this at all)."} {"_id": "38668", "title": "Non-dynamic CMS", "text": "On some of the websites I visit every day (news, sports, etc.), although the content changes very often (several times per day), the URLs always have a .html extension, which makes me think that the content has been generated once and then published as a static page, rather than generated on every call, or even cached in memory. For example, the fictitious site \"mysports.com\" has a \"futbol.html\" page. Then, yesterday, Messi got injured and they had another item to put on that page, so I presume they posted the new item in their CMS system, and automatically a publishing action was triggered afterwards that recreated \"futbol.html\" on a CDN with the new item, probably discarding the oldest one. Then the ETag changes, and clients will get the new page if they try to access it. (The site is fictitious, but this is what I believe happened yesterday on the sports site I read.) This would fit the CQRS approach, and I presume they get huge performance from it. I know lots of CMSs (WP, Drupal, BlogEngine.NET, DNN, etc...), but I have never seen any able to do this, or at least, I was not aware of this feature. What are those distributed CMSs called? Which are the best known?
Cheers."} {"_id": "32530", "title": "File Management CMS?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for some kind of a CMS -- or similar software -- for hosting non-web files (PDFs, DOCs, XLS, other proprietary formats, etc.) for download, in a protected area. Specifically, I'm looking for a few features: 1) User login required 2) Logging of downloads (who downloads what, when, and how many times) 3) Fairly easy for administrators to upload new content. Basically, users would log in to the site, search for files (or see what they've downloaded in the past) and be able to download these files. Non-authenticated users can access none of the content. I've been thinking something like DokuWiki would almost do the trick, though it lacks some features I need. Does anybody know of anything that might fit the bill?"} {"_id": "56279", "title": "I want a multilanguage community membership CMS", "text": "I have built a website on my own which does the following: * Strangers can apply, and there is an interface such that the admin can accept or dismiss applications * Each accepted member has his own profile where he can advertise himself to the World Wide Web * Members can be sorted into different categories, they can post adverts that will be available at different places on the site, and members can also buy certain things in the member area. I also create my own invoices with FPDF. The reason why I want to switch to a CMS is the following: * I want to have the website in many languages, and I want, for instance, some German members to be able to change the German content of the website (not only their profile); therefore, I need a CMS. What would you advise me to do?
Should I get some CMS (like Moodle or WordPress) and try to integrate my own code into this CMS, or is there already some CMS that I can buy that has the features I have listed above?"} {"_id": "12277", "title": "Looking for a CMS to publish online tutorials", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I want to create a site with tutorials on a specific topic. The tutorials would include code samples (so I need code highlighting), pictures, screencasts, and files to download. I would like to edit the pages easily, like on wikis, but also allow for comments, like on blogs. I don't want wiki-like community editing, and I want to update a given tutorial page continuously, like on a wiki, but not like on a blog. So which CMS should I use?"} {"_id": "10600", "title": "Subscription Site CMS", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I was wondering if there are any subscription CMSs out there which provide good functionality? I am looking to have a site which has different levels of users, such as bronze, silver, and gold. They will then be able to access specific content on the site, so articles and videos which are only accessible to certain members. Thanks"} {"_id": "44222", "title": "What is the best CMS for member login and job posting?", "text": "Hi, I have worked with WordPress and Joomla, but I am unsure which CMS to choose. Which one is suitable for my requirements? Please read the requirements and suggest a CMS. Thanks in advance. A) Directory which will list all of our members, with the option to sort by company name and category (e.g. manufacturer, supplier, developer, EPC, legal, accountant, education, government, etc.). Some members will fall into one or more of these categories.
The directory should contain the following: name of company, contact person, phone number, mobile number, address, category, and membership type. Members should be able to add/edit their details, so each member will need to be given a log-in and password. The admin should approve/manage the memberships. B) Job posting section \u2013 members can post job openings."} {"_id": "53707", "title": "Does anyone know a good (video game) matchmaking CMS?", "text": "I'm looking for a good CMS that can work as a matchmaking CMS, particularly for video game ladders and tournaments. Currently I work on Magento and Pinnacle Cart, and obviously I'm not trying to sell anything with my new site, but rather fill a niche that isn't being offered for this video game. That being said, does anyone know of a CMS that is pre-made for this type of application? Or am I stuck modifying a Joomla or dating site CMS?"} {"_id": "16862", "title": "Looking for a very lightweight PHP CMS for single page websites", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I need a very simple, secure CMS for single-page websites, with a single page backend, multilanguage support, and preferably XML based, although that is not so important... I know there are other similar questions, but most of them are quite old. Please don't suggest WordPress; it is definitely overkill for something like this."} {"_id": "53325", "title": ".NET open source free CMS for millions of pages", "text": "I have done extensive research on the Umbraco CMS. I have concluded that it is not suitable for websites that will contain millions of web pages. When logging into the back office, the software must load all the pages, which takes a long time and may even time out with millions of pages.
Is there a good CMS built on the .NET platform that can handle this?"} {"_id": "22438", "title": "Blog/CMS software with an editing style like Stack Exchange", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? Are there any pieces of content creation software for making a blog that support an editing style like Stack Exchange and Stack Overflow? Or magic combinations of WordPress plugins that offer the same? I have been updating a WordPress blog lately and found the turnaround time for content creation and editing is much worse than for Stack Overflow posts. Part of this has to do with writing original compositions rather than riffing off a question, but part of it is the software. I am looking for CMS/blog software that has an overall editing experience similar to Stack Overflow. The most important features I'm looking for: * Inline editing * (Mostly) real-time preview on the same page (both are important features for speeding up data entry) * Markdown support (with inline and block-level code support) * Syntax highlighting. The features I must maintain from my self-hosted WordPress: * Somewhat popular/supported software, with extensibility support * Self-hostable * Will work with MySQL. WordPress has plugins for all of these, but they don't necessarily work together. For example, I've found a few markdown-on-save plugins, but I doubt those have a chance of ever supporting inline editing or real-time previews. Also, the most popular syntax highlighting plugins don't support inline code blocks, and I doubt previews would work with other syntax highlighting methods. If I get a wiki/web page content creation system along with it, or can somehow integrate this into GitHub (with all the features I requested), I'll accept those as side benefits."} {"_id": "24351", "title": "Simple CMS for a one page portfolio", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use?
I want to use a simple CMS for a one-page photography portfolio. Here are some screenshots of the website: homepage and interior page. Basically I just have to insert a thumbnail image for the index, and a specific page for each thumbnail that contains a title, a description, and some images. I don't want to use WordPress; I think it is too much. Could you recommend some other CMS for this type of portfolio?"} {"_id": "48329", "title": "Which FLOSS Python CMS for video sharing in a school?", "text": "I have a university project in which we develop a **video sharing system for schools**. Our central use cases are as follows: * The system has to **manage videos** (and associated arbitrary metadata), where management includes uploading, transcoding, browsing (by metadata) and playing * The system has to implement a **3-tier release process** : * pupils may add videos to the system * teachers may approve added videos, publishing them to other pupils * administrators may revoke the approval of published videos The system needs to be developed in an agile fashion, oriented on Extreme Programming. As a result, the system needs to be implemented in a programming language the developers are familiar with. The biggest overlap in competences of the developers is Python, so, with additional functional requirements to be expected: * The system needs to be implemented and extensible in **Python**. The system needs to be released to the general public, so * The system and all components need to be licensed as **open source**. Which extensible open source Python CMS supports video content and a role-based workflow?"} {"_id": "44861", "title": "Looking for a Social Bookmarking Content Management System", "text": "I want to make a website like Reddit. I already tried Pligg, Drigg, and Elgg, but the code was too messy and impossible to customize.
I really like WordPress and Joomla, but they are not designed for that. Is there any suggestion or plugin to make them work like Reddit, or another CMS?"} {"_id": "2006", "title": "Strengths of various open-source PHP Content Management Systems?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? What are the strengths / weaknesses of the various PHP-based open source CMSes?"} {"_id": "11066", "title": "Blogging platform for private/family blog?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am trying to set up a blog to be used for our family. This blog will not be open to the public, and family members will require a username and password to enter the site. I had originally settled on WordPress with some content restriction plugins, but I am beginning to question that decision. So many WordPress features are not relevant to a family blog. Items like trackbacks are confusing to users and still show up even when trackbacks are disabled for a specific post. To that end, I am looking for recommendations on open source blog engines that might fit the bill. I run IIS and Apache, so either platform is fine. Thanks."} {"_id": "11687", "title": "I need a CMS for online book reading and sharing like the slideshare.net site", "text": "I need a CMS for online book reading and sharing, like the slideshare.net site."} {"_id": "26144", "title": "What free website system do you recommend to promote interpretation/translation services", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I am a business interpreter based in Hangzhou, China. I am considering setting up an independent website to promote my services. I will register a domain name for it. My question is, what free website system do you recommend?
The website system needs to include a content management system so I can add information."} {"_id": "10681", "title": "Best wiki engine to use?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking to set up a wiki as a simple CMS for a resource page. Mostly just PDFs and Word documents will be hosted, but the two main features I'm looking for are the ability to restrict pages based upon user privileges, and blog-style comments between the users. From what I've researched, MediaWiki can easily do the first part with restricting users, but I haven't had much luck finding any plugins for comments. I'm trying to avoid the discussion-style pages from Wikipedia, and have comments just under the article. So far I'm leaning towards trying Tiki out; any other recommendations?"} {"_id": "8513", "title": "I need a multi-language site with webshop functionality. Which CMS to choose?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I need to develop a multi-language site which includes simple webshop functionality. I have extensive experience with WordPress. There are numerous shopping cart plugins available for WordPress; however, none of them is compatible with multi-language plugins such as WPML. Drupal is an option I looked into (using i18n and Ubercart), and I am not sure this is the solution I am looking for. Another solution I considered is to develop a custom WordPress cart plugin that is compatible with WPML. Anyone familiar with this situation? Any recommendation regarding CMSes that fit my needs? Thanks!"} {"_id": "24987", "title": "Photo contest plugin", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? Consider this template. I would like to build a similar-looking site which allows users to register and upload their images.
Each image should have a title and a short description, provided by the uploader, and be immediately available on the site once the upload is complete. Any suggestions?"} {"_id": "14878", "title": "Django pluggable news story application", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? Are there any pluggable Django blog apps that can be embedded into existing pages or templates? It doesn't have to be a full-fledged blogging engine. The features I am looking for are: * Displaying multiple stories with a 'read more' button. Display should be like news items on a news aggregator, without the regular features of blogs like commenting or social networking buttons. * URL for each story * Support for pagination to limit the number of stories that can appear on the home page. * Loadable onto the existing pages or templates. If there are no such apps, suggestions for how the modules (views, models, urls) can be written are welcome."} {"_id": "34946", "title": "CMS without templates", "text": "> **Possible Duplicate:** > Which Content Management System (CMS)/Wiki should I use? I am looking for a CMS where I can lay out the page from scratch using HTML/PHP/CSS and simply enter code such as: FOR EACH (listOfArticles) SORT mostRecent CATEGORY news LIMIT 5
{title}
{body}
END to get a list of the five most recent articles of a certain category in the relevant place. Does such a thing exist anymore? Unless my mind is playing tricks on me, the CMSs of five or ten years ago had this approach. I am thinking of MovableType and the now defunct CityDesk. It seems to me that CMSs these days have a 'templates first' approach, i.e. you must always choose a template before doing anything - which I find really painful. Learning how to design these structured templates also seems overly painful. So can anyone help me in my quest? Thank you, Mark"} {"_id": "17959", "title": "Secure enterprise wiki technology?", "text": "> **Possible Duplicate:** > Which Content Management System (CMS) should I use? I'm looking for a secure wiki technology that will allow shared note taking and documentation. Security, organization, and revision control are critical. Any advice will be appreciated!"} {"_id": "57119", "title": "non-invasive database content management?", "text": "I work for a group that has an already existing, built-from-scratch LAMP (plus JavaScript) website. There are currently no content management systems in place, and up to now the method for updating content has been for one of us more database-savvy employees to simply update the database manually through MySQL Workbench, etc. I'm looking for a solution that would allow us to customize an interface for non-developers to update certain content that lives in our MySQL database. The problem is, I don't want it to also have control over our page templates and front-end styling, as that is already all in place and incorporates many complex web apps that could be difficult to incorporate into a site managed by a full CMS such as Drupal. CushyCMS has functionality very close to the scope I am looking for, but it deals in direct editing of specific HTML files, whereas I am searching for something that would allow users to edit specific parts of a database.
Ideally, I imagine a Drupal-like interface for user editing/input (i.e. drop-down menus, limited functionality so people can't break things, etc.) but applied to a pre-existing database. Has anyone caught wind of something such as this?"} {"_id": "6912", "title": "background css help", "text": "> **Possible Duplicate:** > image behind adsense I have a Google ad where I want to place an image. I tried the CSS, but the image is not displaying properly. My Google ad size is height=280, width=336, and the image URL is http://www.clker.com/cliparts/e/p/V/F/o/0/notepad-hi.png I want this image behind my ad. I tried the following code: { background:url(\"http://www.clker.com/cliparts/e/p/V/F/o/0/notepad-hi.png\") transparent; height: 300px; padding: 50px 30px 0; width: 350px; }"} {"_id": "27378", "title": "How to ensure mysql starts automatically if server restarts on Ubuntu Server", "text": "Occasionally, for various reasons, my web host 'reboots' my VPS after doing some routine maintenance or upgrades. However, when they do, or if I reboot the server for any reason myself, MySQL doesn't restart. To restart it I need to SSH in and run... service mysql start How can I make it start automatically, like the other programs my website depends on to work properly? Otherwise users of my site see the site down until I can get in and fix it. Is this an issue with Ubuntu Server, or is it my VPS host?"} {"_id": "48307", "title": "Adding an RSS feed to a Facebook page", "text": "I want to add a client's blog feed to a Facebook page. Is there a way to do that without using a third party app?"} {"_id": "54248", "title": "Changing to PHP 5-5.5", "text": "I've been using **PHP-5.4.3** for about a month now, and today I decided to make the switch to **PHP-5.5.5**. I downloaded the source code and placed it in `C:/php` ( _also renaming the folder `php-5.5.5` to `php`_ ), and I added the server variables as usual, `C:/php/` _<-- but here I got stuck_.
Usually I appended `php.exe` at the end (the file found inside the PHP folder) so I could access PHP from the command line or start the built-in server; but now I can't find this file, and I can't find a way to start the server from the command line either. Can anyone help?"} {"_id": "53908", "title": "Change folder name in server", "text": "If I change a folder name on the server, and thus the URL changes, can I perform a 301 redirection, or is that only for the case where the web page has been transferred to another folder?"} {"_id": "56123", "title": "Structured data and visible content revisited", "text": "Structured data (like schema.org markup) should traditionally be used on visible content (a search on here will confirm that), and Google and others have said as much. However, schema.org is now supporting JSON, which would never be visible on-page. Additionally, and for quite some time, the Good Relations Snippet Generator has created snippets that are empty divs with instructions to post it before the closing `` tag of your webpages. Does anyone have practical experience (even better, A/B testing) of the results (or problems) from using Semantic Web markup on visible vs. non-visible content, or any insight/guidelines?"} {"_id": "68489", "title": "<br> in meta-description, ok or not?", "text": "I have a dynamically constructed page that automatically pulls specific page content and applies it to the meta description. There are `<br>` tags that end up in the meta description. Should I write the extra code that will remove these and store the cleaned-up version, or is it ok/safe/bad-for-SEO to leave the tags there?"} {"_id": "49740", "title": "Suddenly website stopped appearing in Google Search", "text": "We have had a working website for more than a year. Everything was ok: the site worked, appeared in the relevant Google searches, etc. Then one day we discovered that the site was **totally** removed from Google Search results. (I don't complain that it ranks low, not that.) It just never appears; even if I search Google for the site name itself, it does not show up. The site is subscribed to Google Webmaster Tools, Google Analytics and AdWords (currently no ad campaigns are running). All these tools show not a single warning. We received no complaint email or anything. I don't quite understand what the reason could be, or is it a bug in Google (?!). Bing does not show the website either, but we never checked Bing before. The site does appear on Yahoo Search. On `ask.com` it does not appear. Any clue would be helpful. _One thing that I can think of - we are a dynamic DNS provider and we give our users domains like `user.website.com`. It is possible that some user may use such a domain for some malicious activity, like sending spam, running a malware website or anything else. I know that some antivirus (I think Avira), probably because of the aforementioned reason, marks our website as a malware site._ The site in question is `https://www.net-me.net`."} {"_id": "56129", "title": "Google Analytics regards direct access as referral for subdomain", "text": "When I directly access `subdomain.mydomain.com` by entering it in the browser's address bar, Google Analytics Realtime Overview indicates that the top referral is `mydomain.com`. I'm sure that I'm the only visitor of my website at that moment (it is just one visitor from my location).
I expect that it should indicate for Top Referral: `There is no data for this view.` What is the cause of this behavior?"} {"_id": "56128", "title": "Prevent Google Indexing SubDomains", "text": "I have cPanel hosting with only 1 IP. I'm in the process of pointing all my TLDs over to this server. I have created 'Addon Domains', and all the folders it creates are located in subdirectories 'domain1.com', 'domain2.com' in my root website '/home/user/public_html/client/', but they can be accessed from 'domain1.maindomain.com'. The thing that is worrying me is whether Google indexes these subdomains. What can I do to prevent this? Would a rewrite rule like this be sufficient? RewriteEngine on RewriteCond %{HTTP_HOST} !^www\\.domainpointtosubfolder\\.com$ [NC] RewriteRule ^(.*)$ http://www.domainpointtosubfolder.com/$1 [L,R=301] Or is there a better way?"} {"_id": "57290", "title": "How to manage a site trying out different technologies?", "text": "I'm a complete newbie to web hosting. I only have programming skills, but I'd like to acquire a domain. Then I'd even like to have the chance to manage different technologies for hosting websites and applications. I mean, I don't only want to try the Apache, PHP and MySQL stack, but also to try out node.js and Ruby on Rails. These might be subdomains, each corresponding to a different technology. What is the suggested setup for using so many technologies?"} {"_id": "54219", "title": "In Google webmaster tools, can a \"soft 404\" be triggered by the text on the page?", "text": "I just ran across an error in Google Webmaster Tools that I have never seen before. I manage the website for my local community band (I play trombone). One of the pages on the site is a list of our upcoming performances. It is powered by a WordPress events plugin that uses a database of upcoming events that are entered through the administration interface.
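For the subdomain-indexing question above (`domain1.maindomain.com` aliases of cPanel addon domains), a redirect works, but an alternative sketch - untested against that exact cPanel setup, with `maindomain.com` standing in for the real main domain - is to send a `noindex` header only when a folder is reached via the alias hostname. This assumes Apache with `mod_setenvif` and `mod_headers` enabled:

```apache
# .htaccess inside an addon domain's folder,
# e.g. /home/user/public_html/client/domain1.com/

# Flag requests that arrive via the *.maindomain.com alias...
SetEnvIfNoCase Host \.maindomain\.com$ via_alias
# ...and tell crawlers not to index those copies.
Header set X-Robots-Tag "noindex, nofollow" env=via_alias
```

The addon domain itself (`domain1.com`) never matches the condition, so it can still be indexed normally.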
We just finished up our summer and fall concerts, and our next performance will be our Christmas concert. I hadn't gotten around to adding that to the website yet, so there are no upcoming events shown on the page. In fact, the text on the page says: > No upcoming events listed under Performance. Check out past events for this > category or view the full calendar. Then in Google Webmaster Tools, this page is showing up as a \"soft 404\": ![google webmaster tools soft 404](http://i.stack.imgur.com/sssFa.png) The page is returning a 200 status, and Google is indicating that the 404 is \"soft\". I wouldn't have expected Googlebot to be sophisticated enough to parse that particular sentence. Is Googlebot able to detect that the text on the page indicates that there is currently no content, and then treat it as a 404 page because of that? If Google is treating this page as a soft 404 because of the text on the page, does that mean that, like regular 404 pages, the page won't show up in search results?"} {"_id": "57292", "title": "Returning visitor stats reliably over longer periods of time in Google Analytics", "text": "I'm looking into how useful returning visitor stats in Google Analytics are over longer periods of time, such as two or three years. It seems that the `__utma` cookie is set to expire 2 years from set/update, which suggests that this cookie should be updated each time the user visits the site, and the 2-year expiration reset. Would I be right in thinking, then, that it's reasonable to expect the returning visitors data to be no less accurate over longer periods than it is in the last year, for example?"} {"_id": "42466", "title": "Weird 404 crawl errors in Google webmaster tools", "text": "If anyone can help, I'd be very grateful! We're getting strange domains coming up in Google Webmaster Tools - they seem to be breeding.
When I look at where the links are coming from, it's either: * non-existent pages on our website (usually) or * some other site's URL I'll give some examples: * hechenghai/article/enquiry/frmenquiry.aspx * sql-server-reporting-services-training/~/ms-project/~/stored-procedures/~/enquiry/frmenquiry.aspx * blogs/BradSchacht/ssis/blog/blog/enquiry/frmenquiry.aspx I'm sure Brad Schacht is a great guy, and hechenghai a great place, but I'd never heard of either of them till they turned up in Google tools, and they're certainly not on our website. Some things to help: * I have a simple sitemap which Google knows about, and have checked this. * We don't use an htaccess file (it's all in Windows) * We don't use any illicit or black-hat techniques * We generate the site from a .NET system which writes out the HTML pages You can see the site at http://www.wiseowl.co.uk"} {"_id": "25289", "title": "Is there a way to compile LESS files to CSS (so that every browser doesn't have to)?", "text": "I'm considering using LESS, but I cringe at the thought of a browser having JavaScript disabled and my CSS not being readable (and the site looking awful). Also, it offends the engineer in me to force every browser to \"compile\" that LESS file to .css when we could just \"compile\" it once every time we modify it."} {"_id": "57298", "title": "How many users use geotargeting while searching in Google", "text": "Are there any stats about how many Google/Bing users may be selecting a country (as in the pic) in the SERP?![enter image description here](http://i.stack.imgur.com/O7EOM.jpg) As Susan says here, geotargeting in GWT only works when users select a country as in the pic above.
If it's a small percentage, then there won't be much use in webmasters geotargeting websites/pages."} {"_id": "25287", "title": "Is LESS ready for use in a production site?", "text": "I'm considering using the Twitter Bootstrap HTML/CSS templates, which use LESS (a sort of scripting language for CSS). I've never used it, so I was wondering if it's ready for \"prime time\"."} {"_id": "52863", "title": "image map does not work on Opera and Mozilla browsers (only Google Chrome)", "text": "I have created a WordPress blog and inserted an image with a map and some links inside, but I can't open any link by clicking on the image in any browser except Google Chrome. This is the code: \"\"

\"\" \"\" \"\" \"\"

"} {"_id": "25282", "title": "How to track click sources in a Google-compatible way?", "text": "On my website, I track when users click on a search result that brings them to a details page for an item. I then save what search query they used before they clicked. My current solution is this: 1. Each link in search results is in the form `/goto//?search_id=...` 2. `/goto//` saves that a user used a given search id to get to a given item, and then returns a 302 redirect to `/details//` 3. `/details//` displays the details page and does not do any tracking. For the users, everything works fine, but when I check Google search results for my page, the direct links to the details page URLs say `/goto//?search_id=...` with some old `search_id`, instead of `/details//`. I feel like I'm missing an obvious solution :) The only thing I came up with so far is using `/details//` links in HTML and using JavaScript to replace them all with `/goto//search_id=...`, but that seems like an overkill. Any better ideas?"} {"_id": "67164", "title": "What is the potential harm of using Google's Disavow Links tool?", "text": "Google gives a warning on its disavow links tool, that says: > This is an advanced feature and should only be used with caution. If used > incorrectly, this feature can potentially harm your site's performance in > Google's search results. We recommend that you only disavow backlinks if you > believe that there are a considerable number of spammy, artificial, or low- > quality links pointing to your site, and if you are confident that the links > are causing issues for you. What exactly is the risk of using this? If I have a bunch of spammy links that a competitor built, would there be any potential harm of disavowing them? EDIT: In my particular case, my main site has been delisted from Google's search results, and at approximately the same time, I noticed that I've started getting a lot of spammy links to my site. 
No warning has been issued by Google in Webmaster Tools, but I want to disavow the spammy links anyway."} {"_id": "32013", "title": "Adwords traffic showing up in Analytics as organic traffic, not paid?", "text": "I've got AdWords ads running, driving PPC traffic to my site. When I go into Google Analytics under the tab Traffic Sources > Search > Organic, I get all the search terms people have come through for. I can't tell if they're from organic or paid search, but I believe they are paid, as I'm not ranking organically for the terms that I'm getting traffic through. If I go under Search > Paid I get nothing. They are linked via > Tools and Analysis > Google Analytics. Any idea what might be happening?"} {"_id": "16252", "title": "can mod_rewrite be used for this problem?", "text": "We used to have a URL `http://www.abc.com/index.php?itemID=144`, which has moved to `http://www.abc.com/index.php?itemID=1556`. We want users who hit the old URL (144) to reach the new one (1556). How can this be achieved - with mod_rewrite or anything else?"} {"_id": "16251", "title": "Should Site Title be Before or After Page Title?", "text": "Apologies if this is a dupe. I tried searching, but didn't find anything specifically addressing this concern. When creating a large(ish) site, page titles usually reference both the site name and the current page name. However, it seems there are two main conventions: Bob's Awesome Site - Contact Page and Contact Page - Bob's Awesome Site I've looked around, and pages usually use one of the two variants above. **Is there any reason to use one over the other?
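For the `mod_rewrite` question above about moving `itemID=144` to `itemID=1556`: `RewriteRule` alone never sees the query string, so a `RewriteCond` on `%{QUERY_STRING}` is needed. A minimal `.htaccess` sketch (untested, assuming Apache with `mod_rewrite` enabled):

```apache
RewriteEngine On
# Match only when the query string carries the old ID...
RewriteCond %{QUERY_STRING} (^|&)itemID=144($|&)
# ...and issue a permanent redirect to the new one. A "?" in the
# substitution replaces the old query string entirely.
RewriteRule ^index\.php$ /index.php?itemID=1556 [R=301,L]
```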
SEO/readability/usability/etc?** I've thought about it, and have only come up with: * Page first - differentiates the tab when the browser is crowded with lots of tabs * Site first - immediately see the \"parent\" site, so to speak; a more cohesive experience"} {"_id": "273", "title": "What is the easiest/lightest setup to get a basic LAMP stack setup for development?", "text": "I'm looking for a minimum of fuss here to get up and running. Bonuses: - cross platform - portable (can be installed/run from a USB) _Clarification: I'm not looking to set up a full-fledged remote testing server; I just need something simple so that I can load localhost in my computer's browser and check my latest changes._"} {"_id": "18110", "title": "Shared Host VS Cloud server", "text": "> **Possible Duplicate:** > How to choose between web hosting and cloud hosting? What is the difference between a shared host and a cloud server? I have a URL http://domain.com, and with a shared host I easily have FTP details so I can upload everything to the server. Is it the same with a cloud server, or is it the same as Amazon CloudFront, where you haven't got FTP details etc.? What are the differences in terms of speed? Thanks a lot"} {"_id": "16256", "title": "Anonymous versus logged in users on my site & Google Analytics", "text": "I'd like to be able to run two different 'tracks' for Google Analytics; one for anonymous users of the site and another for users who are logged in. I say \"track\" because I'm not sure of the term -- but I definitely know I want it all to be in the same \"Analytics Account\"; I just want to segregate my logged-in users. In the site template, I can very easily add a conditional to display one or the other (Analytics code snippet)... which is what I'm hoping this comes down to. Although I'm not sure, it seems that the last digit in your Analytics ID (e.g. `UA-15XXXX0-X`) could be incremented to gain such additional 'tracks'.
My current footer snippet: "} {"_id": "32015", "title": "Is the Title attribute still used?", "text": "I'm curious if anyone still uses the Title attribute. I've noticed that none of the sites I visit on a regular basis use it at all - I'm talking about sites like Ars Technica, The Verge, the BBC, etc...big triple-A sites. I found one random reference on the W3C's site to it being abused. Is it just out of favor now, or is there some specific reason?"} {"_id": "16254", "title": "List of usage information to collect in a web application", "text": "I'm writing a web application that will allow people to create accounts, edit stuff, send stuff to people, etc. I plan on recording things like when things were created and sent, and so on. Is there a list of usage information that one should collect in a web application? I'd like to see whether I'm missing something. Also, is there a list of usage information that I shouldn't collect (like maybe information that people find private)? EDIT: I was thinking about information that is collected automatically (like timestamps and IP addresses) rather than information that is entered manually."} {"_id": "24379", "title": "Is it possible to add a logo to an existing image file?", "text": "Is it possible to add an additional image to an existing image using PHP or jQuery? The logic is the same as a watermark in PHP, but I want the logo to be draggable so a user can position it anywhere on the existing image."} {"_id": "16706", "title": "Moving to a new domain - what to do to minimize visitor loss", "text": "I want to move my website to a new domain. It would be nice if you could share some experience or suggestions to minimize the loss of visitors. In Google Webmaster Tools there is an option which allows moving search results to a new domain. Should it be done through a 301 permanently redirected page? Can you tell me what I have to do for this? Do I need to put this 301 redirect on every page I had on the old domain? More ideas are welcome.
Thanks."} {"_id": "16700", "title": "Blocking URLs and canonical questions", "text": "We are running a Magento store setup and are looking to block all pages with the exception of a select few. It seems the only way we can do this is by blocking direct paths to files that are in the root and then wildcarding various other directories. My question is: if we have a product like website/ultragloss-black.html, the canonical is actually /zurfiz/ultragloss-solid-colours/ultragloss-black.html. If we block /zurfiz/*, will this block the short URL version, as the canonical is in the /zurfiz/ directory?"} {"_id": "50352", "title": "Can you put content from your site on Facebook, or will that cause duplicate content?", "text": "I have a question about duplicate content. Is it okay to take a paragraph or two directly from your website and put it on your Facebook page as a post, say, about your products or services, or even your about us page? Does this harm your website ranking, have no impact, or is it not worth doing because it may be problematic but there is no clear consensus? A coworker and I have been discussing this, where I admit to being more concerned about it than they are."} {"_id": "68888", "title": "My website Internal links are no-follow", "text": "I have a website and all the links are showing as no-follow. Is that good for SEO purposes? I want to make them do-follow; what should I do now? My website was accessible in two versions, with `WWW` or without `WWW`, and it was also accessible via `index.html`, but I have now edited the _.htaccess_ file and also done a redirect (301) for the `index.html` page. Now the website is accessed with the `WWW` version, and `index.html` also redirects to the home page, but the problem is that all internal links are showing as no-follow. Please help me with the _.htaccess_ code. What should I do now? For more info, find the snapshot of the _.htaccess_ code and write the code for me.
![enter image description here](http://i.stack.imgur.com/F4mG4.png) Please find the images for more info: ![enter image description here](http://i.stack.imgur.com/6E5RG.png)"} {"_id": "12019", "title": "Page doesn't show up in Google searches", "text": "When I search on Google for \"pollackstorch\" I get an invitation to the administrator rather than the page itself. When I search on Bing or AltaVista I get the page itself. How can I get Google to return the page itself, like the others do?"} {"_id": "15909", "title": "Does Google Analytics data affect SEO?", "text": "I have several pages on my site on which I did not include the Google Analytics code. Will it improve my site's Google ranking if I include GA on all of my pages and increase the number of visits in GA?"} {"_id": "55116", "title": "Include latest searches in search engines index", "text": "My websites generally include a page where you can publicly see the latest search terms used by the site's users. I know it's not a good security practice to allow this, since you can find undesired content. On the other hand, it boosts the number of pages indexed, since every new search can provide a link on Google, and people can find you with related keywords that you are not using on your web page. What is the rationale behind including or excluding these results in search engine indexes?"} {"_id": "12014", "title": "'Helvetica Neue' webfont", "text": "I am trying to figure out where all the sites that use the 'Helvetica Neue' font get it from. There are certainly too many sites using it for it not to be free. But frankly, there are only two font sites which have it in their catalog, and it is for sale (and not as a webfont). Moreover, can you give some examples of **webfonts** which are similar to 'Helvetica Neue' and are (mostly) free?
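For the no-follow/`.htaccess` question above (forcing the `www` host and redirecting `index.html` to the home page), a minimal sketch of the canonicalization part - `example.com` is a stand-in for the real domain, and note that `rel=\"nofollow\"` on internal links comes from the HTML templates/theme, not from anything `.htaccess` controls:

```apache
RewriteEngine On
# Redirect the bare domain to the www. host (301 = permanent).
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
# Redirect /index.html to the home page.
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```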
Thank you!"} {"_id": "12017", "title": "Selecting a CMS for building a site with e-learning", "text": "I would like someone who has experience with a CMS or component for building a website with e-learning to suggest one."} {"_id": "12016", "title": "Is it possible to fight against domainers?", "text": "These days I have been looking for a domain for my web application, but I've realized that 99% of them are taken, and of these, 9 out of 10 belong to domainer speculators. \"Welcome to the real world\" you may think, but I'm wondering if there is anything that we can do to \"fight\" against this situation."} {"_id": "12010", "title": "Global Statistics for Browser + Operating System", "text": "I'm trying to determine global statistics (similar to StatCounter's) for a combination of browser + OS. For example, 35% (or whatever) of users are on XP **and** IE8. StatCounter only displays one or the other (e.g. 47.32% on XP, 30.20% on IE8). Is anyone aware of a site that offers this information - preferably free of charge - or a way to glean it from StatCounter or others?"} {"_id": "52468", "title": "Is there a problem in having the same product with different names on different pages?", "text": "When it comes to structured data (schema.org for products), is there a problem in having the same product with 2 different names on 2 different pages for layout reasons? Example: a category page with many products, where items appear in smaller divs that don't fit the complete name, vs. a product page totally dedicated to one product that fits all the information. **Category Page:** Dell 30\" Monitor **Product Page:** Dell UltraSharp 30\" LCD Monitor Thanks"} {"_id": "17259", "title": "How should I prepare the design of a web page for a web developer?", "text": "What techniques, software or practices do you use to prepare a description of a web page for further development?
I am doing some research (with little luck) into how to create descriptions for web developers - what should be included on the web page (input widths, font sizes, image placement, etc.). Right now I use a combination of Excel and Word documents. In complex cases this is inefficient. Any other suggestions?"} {"_id": "26397", "title": "Redirect *.example.com to example.com", "text": "I have some valid subdomains in my httpd.conf file. Now how do I redirect anything that's not a valid subdomain to the main website? What I mean is, if I have a valid subdomain hello.example.com, it should go to hello.example.com, and if it's blah.example.com, which is not a valid subdomain, it should go to example.com. How can I make changes in my httpd.conf file to achieve everything I just mentioned?"} {"_id": "26396", "title": "SEO - advice on costing and setting deliverables", "text": "I've some experience of doing on-site SEO work for clients for whom I've been developing their website at the same time. It's always been added in as part of the overall package and the costing for the SEO work has never been explicitly separated from the main budget. The results have always been good, with me basically following best practice with on-site optimisation. I deliberately don't get involved with the off-site side of things. Anyway, without rambling on too much, I have a situation where I've been asked to cost for on-site optimisation work for a new client - but I'm worried about how to qualify what is an achievable result. The site has around 30 pages, none of which are optimised at all currently. The market they operate in is pretty competitive but I'd be confident of significantly improving their SEO ranking. How would deliverables be structured for this type of work? Do you promise that you'll get the site to a minimum of position n on Google? Also, out of interest, does anyone have any basic ballpark figures for what this type of work would typically cost for a site of this size? 
I'm looking more at a one-off fee as opposed to an ongoing monthly service. Grateful for any pointers on any of this!"} {"_id": "26393", "title": "Ad Block is blocking some images, excluding others, on a site I host", "text": "I've created a website that has political banners (in .GIF and .PNG) with the intention of people being able to copy and embed them with the code provided below the image. Strangely, 3 of the 10 banners weren't appearing, and I discovered that my Ad Block extension was treating those as ads, but none of the rest. They are all sourced from the same file path, and they all contain relatively the same content, just in different sizes. It would be my preference to not have these treated like ads whatsoever, but obviously that is not a decision for me to make. I'm curious as to why some are being excluded over others, though. If these are considered advertisements by the extension, it seems silly to me that as the 'source of advertisement' and host of this site, I can't simply provide something for others to share when they are voluntarily there to see just that. If this is something I have to eat for the greater good of preventing spam, then that's fine. I'm just curious to see if I have any options. Many thanks PWM"} {"_id": "30700", "title": "SSL on multiple directories", "text": "I have a website that is http, but has a port set up for https for a specific directory that is for the shopping cart. Now I'd like to use our SSL on a different directory in the same site as well. How can I go about configuring that? I have tried looking into all of the config files as well as the docs and cannot figure it out. I tried setting this in httpd.conf as well #also tried *:[ssl port] and [actual IP]:[ssl port] ServerAdmin shred@me.com DocumentRoot \"C:/path/sslNeededDir\" ServerName www.example.com hoping that it would cause pages in this directory to use the SSL port and become https, but that didn't do anything. Any help is greatly appreciated. 
**Update:** This finally got migrated over. I'm still looking for a solid answer on this. If anyone could help me, it would be greatly appreciated."} {"_id": "69053", "title": "SEO: how to organize a website with multiple, heterogeneous topics?", "text": "I happen to have decent knowledge about a ton of very different topics. I'd like to start a **large** blog website talking about them, but while some of those topics are more or less close, others are really alien to them. For example, I'd like to post a series of blog posts about how to optimize WordPress; another bunch would be about OpenCart. But I also know a good deal about configuring Ubuntu servers, so I also want to create a series of posts about that (by series I mean 100 or so). But I also know a lot about finance and markets trading; I could write thousands of posts about that. But I also know a lot about online gaming (and coding); I could write a number of posts and make videos and so on. I have read that the best way is to keep everything under the same domain so as to build \"critical mass\", which in turn brings Google rank. But wouldn't so much topic diversity actually convince readers that the website is amateurish or too much of a \"do it all\" / unfocused site, and actually lead them to abandon it? Also, I read that a best practice is to keep only one consistent theme across the whole website. I find it hard to set up one theme which is effective and good for such diverse stuff as WordPress and gaming or finance. I don't have problems implementing different themes as I got the full Genesis bundle. As of now I have started creating empty websites focused on one task (e.g. www.wp-optimize.com for WordPress optimization) but I fear I'd easily end up creating 7-8 unrelated small websites that would never gain the momentum to rank up. Basically, I have the content, I have the quality (well, I hope so!) and even have some quantity. 
I am \"just\" asking, with this peculiar situation on my hands, if there are best practices to monetize my efforts (with ads) in a rational and effective way."} {"_id": "52463", "title": "Is it a good idea to make a \"safety copy\" of articles from referring sites?", "text": "My wife has a small business with a website, of which I am the maintainer. (I am not a trained web developer, but know the basics of HTML and CSS.) There are already a few articles / blogs / news items about them online on other sites. She would like to keep a list of references to these. However, she would like to ensure that the articles are available even if the other site removes the original content or migrates it to another URL. So she somehow would like to store a copy of these articles on their own site. This raises a few questions for me. First of all, is it a legal and accepted practice in general (assuming we have the original owner's consent)? I know the legal part is country specific and I should rather consult a lawyer. At this point I am more interested in whether Google and other search engines penalize this, and how to play by the rules if possible."} {"_id": "17254", "title": "Sponsored blog or explicitly owned?", "text": "I'm trying to convince my client to use a blog to communicate with his audience and promote his site. However, I'm still unsure whether this blog should explicitly advertise the brand, as in a Nike blog, or appear discreetly as a sponsor, should anyone read the 'about us'. The reason I'm concerned is that the purpose of this blog is to produce articles about subjects that are of interest to the brand's audience, not an institutional blog that would eventually publish news, promotions and whatnot. There's already a news bulletin for people who are directly interested in the brand, and we're trying to reach another audience: the people who'd probably leave if they noticed the blog is institutional."} {"_id": "30703", "title": "Check-o-matic? 
is there such a thing?", "text": "Several insurance companies use a service called Check-o-matic, where they deduct from the user's checking account. Has anyone seen a service like this? It would be a competitor to PayPal. I know PayPal will deduct from a checking account, but they are looking for a specific service. I don't know if one exists, but all the insurance companies call it the same thing, Check-o-matic."} {"_id": "17256", "title": "How can a website survive an untrusted environment?", "text": "Let's say I have an internal website that holds pretty important data (well, contest strategies for stuff like FIRST), which I am bound to host on a Windows server with literally no security. I can't access anything but the public directory. We have script kiddies from nearby schools constantly trying to get a competitive advantage by hacking into the servers and stealing information. I'm about 30% sure there are viruses on the system too. Is it possible to build a website that could survive such treatment?"} {"_id": "60091", "title": "Building a funnel retroactively in Google Analytics", "text": "I have an issue with trying to devise a funnel in Google Analytics. I am not looking to create a goal funnel, as I am looking to do it retroactively on my data, so I need to either do the crunching manually or devise a strategy for it. I wish to see sessions that have navigated through these pages, for example: / -> Category page A -> Cart -> Conversion / -> Category page B -> Cart -> Conversion / -> Category page C -> Cart -> Conversion How would I go about creating a funnel like this in Google Analytics and extracting this data **retroactively** (emphasis so I do not get recommendations for setting up goal funnels, since I want to do this on already processed data)? I imagine I could use the segmentation sequence, but how reliable is that? I am getting sampled data very quickly and I want to be able to do this for a couple of months back, but I get sampled even after choosing a week. 
Any ideas?"} {"_id": "63024", "title": "Massive 404 attack with non-existent URLs. How to prevent this?", "text": "The problem is a whole load of 404 errors, as reported by Google Webmaster Tools, with pages and queries that have never been there. One of them is `viewtopic.php`, and I've also noticed a scary number of attempts to check if the site is a WordPress site (`wp_admin`) and for the cPanel login. I block TRACE already, and the server is equipped with some defense against scanning/hacking. However, this doesn't seem to stop it. The referrer is, according to Google Webmaster, `totally.me`. I have looked for a solution to stop this, because it certainly isn't good for the poor actual users, let alone the SEO concerns. I am using the Perishable Press mini blacklist (found here), a standard referrer blocker (for porn, herbal, casino sites), and even some software to protect the site (XSS blocking, SQL injection, etc). The server is using other measures as well, so one would assume that the site is safe (hopefully), but it isn't ending. Does anybody else have the same problem, or am I the only one seeing this? Is it what I think, i.e., some sort of attack? Is there a way to fix it, or better, prevent this useless waste of resources? **EDIT** I've never used the question to give thanks for the answers, and hope this can be done. Thank you all for your insightful replies, which helped me to find my way out of this. I have followed everyone's suggestions and implemented the following: * a honeypot * a script that listens for suspect URLs in the 404 page and sends me an email with the user agent/IP, while returning a standard 404 header * a script that rewards legitimate users, in the same custom 404 page, in case they end up clicking on one of those URLs. In less than 24 hours I have been able to isolate some suspect IPs, all listed in Spamhaus. All the IPs logged so far belong to spam VPS hosting companies. 
Thank you all again, I would have accepted all answers if I could."} {"_id": "63025", "title": "What to do when we had a backlink (from a gone site) to one of our 404 pages?", "text": "Previously there was a page on our partner's site that had a link to our site. Now, this page and the whole partner's website are gone (404). But this page still exists in our website's backlink profile. Do we need to disavow or take any other action regarding this partner's website (page)?"} {"_id": "57478", "title": "I have permission to republish articles. Do I need to use rel=canonical?", "text": "I made a deal with a guy. I will give him advertising on my site for his business. He will give me permission to reprint the articles that appear on his site (he wrote them). I was looking for a lot of content quickly and this gives it to me. But now I'm reading that in order not to get dinged by Google, I need to pass any SERP power off to his site using the rel=\"canonical\" tag in the head of each page. Is this true?"} {"_id": "26120", "title": "Web service for selling intangible goods like file downloads", "text": "I have a product that I have developed over the course of 2/3 years in my spare time and wish to bring it to market. I won't reveal the nature of the product, but it is training-based, built around PowerPoint and Excel worksheets averaging around 3 to 5 MB, and at the moment I have upwards of 10,000 separate files. My question to you all is, does a service exist to allow me to upload all of these, categorise them nicely together so customers can buy them as 'packs' or 'individually', and handle all payments and delivery of the files? Ideally I could use my own domain name as well. I am UK based but would move to US services if the cost differences are obvious."} {"_id": "9259", "title": "Directing from a 1und1 hosting solution, with URLs intact", "text": "I have done this before on GoDaddy without a hitch, but I cannot seem to figure out this particular case. 
I have a domain space with temporary URL `http://yogainun.mysubname.com` and am hosting the domain name that is to be applied to it at `1und1.de`. Right now I have set it up so that from the 1und1 domain name hosting, the address `http://www.yoga-in-unternehmen.de` is frame redirected to the subdomain that I just referred to. But this is not what I want. `http://www.yoga-in-unternehmen.de` is to be the domain. With the frame redirect, URLs like `http://www.yoga-in-unternehmen.de/example-article` do not show up. But this is what I want. With GoDaddy in a similar case, I just turned on DNS and changed the name servers. That worked without problem, but not with 1und1. Is there something I am missing?"} {"_id": "9524", "title": "Amazon-like ecommerce site", "text": "Hey there, my idea was to make an e-commerce site a lot like Amazon. Not exactly cloning it, but since it's for a niche market, I need something like it. I was thinking of using Magento or something like that as a base, but I can't figure out how to allow users to: * Sign up for an account, and get verified by me. * Be allowed to add items, so they can be searchable. * Post product reviews. What can I use to achieve/make this, and what are some suggestions? I can code in PHP and Python, thanks!"} {"_id": "3466", "title": "How to have a blogspot blog in my domain?", "text": "I have a blog at http://afsharm.blogspot.com. How can I have this blog and all old posts, comments and templates in my own domain like http://myowndomain.com/? Which specifications must this domain have?"} {"_id": "9526", "title": "A single request appears to have come from all the browsers? 
Should I be worried?", "text": "I was looking over my site access logs when I noticed a request with the following user agent string: > \"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.12) Gecko/20101026 > Firefox/3.6.12\\\",\\\"Mozilla/5.0 (Windows; U; Windows NT 5.1; pl-PL; > rv:1.8.1.24pre) Gecko/20100228 K-Meleon/1.5.4\\\",\\\"Mozilla/5.0 (X11; U; Linux > x86_64; en-US) AppleWebKit/540.0 (KHTML,like Gecko) Chrome/9.1.0.0 > Safari/540.0\\\",\\\"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) > AppleWebKit/532.5 (KHTML, like Gecko) Comodo_Dragon/4.1.1.11 > Chrome/4.1.249.1042 Safari/532.5\\\",\\\"Mozilla/5.0 (X11; U; Linux i686 > (x86_64); en-US; rv:1.9.0.16) Gecko/2009122206 Firefox/3.0.16 > Flock/2.5.6\\\",\\\"Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) > AppleWebKit/533.1 (KHTML, like Gecko) Maxthon/3.0.8.2 > Safari/533.1\\\",\\\"Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; > rv:1.8.1.8pre) Gecko/20070928 Firefox/2.0.0.7 > Navigator/9.0RC1\\\",\\\"Opera/9.99 (Windows NT 5.1; U; pl) > Presto/9.9.9\\\",\\\"Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-HK) > AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 > Safari/533.18.5\\\",\\\"Seamonkey-1.1.13-1(X11; U; GNU Fedora fc 10) > Gecko/20081112\\\",\\\"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; > x64; Trident/5.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR > 3.0.30729; Media Center PC 6.0; Zune 4.0; Tablet PC 2.0; InfoPath.3; > .NET4.0C; .NET4.0E)\\\",\\\"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; > WOW64; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; > Media Center PC 6.0; MS-RTC LM 8; .NET4.0C; .NET4.0E; InfoPath.3)\" The request appears to have originated from 91.121.153.210 - which appears to be owned by these guys: http://www.medialta.eu/accueil.html I find this rather impressive - a request from _'all'_ user-agents. There are actually quite a few of these requests over at least the past few days - so it naturally piqued my interest. 
Searching Google simply seems to produce a very long list of websites which make their Apache access logs publicly available... Is this some weird indication that we're being targeted? And by whom?"} {"_id": "10700", "title": "How can I redirect everything but the index as 410?", "text": "Our site shut down and we need to give a 410 response to the users. We have a small one-page replacement site set up in the same domain and a custom 410 error page. We'd like to have it so that all page views are answered with a 410 and redirected to the error page, except for the front page, which should point to the new `index.html`. Here's what's in the _.htaccess_ : RewriteEngine on RewriteCond %{REQUEST_FILENAME} !-f RewriteRule !^index\.html$ index.html [L,R=410] This works, except for one thing: **If I type the domain name, I get the 410 page**. With `www.example.com/index.html` I see the index page as I should, but just `www.example.com` gets 410. How could I fix this?"} {"_id": "9252", "title": "Off-site Cardholder Data Storage", "text": "Is there a service or site out there that will store cardholder data for me? I don't need any kind of transaction processing or recurring billing... I just need somewhere that I can store data until someone in my company is able to look at it. The specific need is allowing customers to input data that will be used for credit checks. Name, Address, Credit Card(s), and such. Google Checkout, PayPal, NetSuite, and Authorize.net seem to be what everyone suggests to me, but they don't offer what I need -- they're just payment gateways."} {"_id": "9253", "title": "Does anyone have statistics on the percentage of people that use the search feature on a website as opposed to the navigation bar", "text": "I was interested in knowing the percentage of people who use the search feature on a website as opposed to the navigation bar. Has the search bar become more powerful than the traditional navigation bar for finding data on the site? 
Has the \"sitemap\" become obsolete? I know it might be a subjective question, but if someone has statistics on this or a white/technical paper, that would be great."} {"_id": "9250", "title": "How to suppress PHPSESSID in URL for Googlebot?", "text": "I use cookie-based sessions, and they work for normal interaction with our site. However, when Googlebot comes crawling, our PHP framework, Yii, needs to append `?PHPSESSID` to each URL, which doesn't look that good in the SERPs. Any ways to suppress this behavior? PS. I tried to utilize `ini_set('session.use_only_cookies', '1');`, but it does not work. PPS. To get an impression of the SERPs, they look like this: * http://www.google.com/search?q=site:wwwdup.uni-leipzig.de+inurl:jobportal"} {"_id": "9523", "title": "Webhosting for a TV channel with streaming video", "text": "I'm making a website for a web-based TV channel, so I'm assuming it will be heavy on bandwidth usage, but I'm no good at calculating bandwidth. A couple of questions: 1. Assuming the site streams HD video 24/7 to 1000 people, how much bandwidth is that? 2. Where should something like this be hosted? The channel will have a fiber optic internet connection, but I don't know the limit on their bandwidth; would it be better to get their own server or host online? In either case in question 2, any recommendations? I'm usually a regular web designer for minor businesses, so this is a new level. Your help is appreciated."} {"_id": "24881", "title": "Cache a particular image using .htaccess", "text": "I would like to cache a particular image on my website, i.e. the background image. 
So far I only see options that allow you to define the format type (png|jpg|ico). Is there any way to let the browser cache the background image?"} {"_id": "21619", "title": "Is it possible for CloudFlare to cause an increase in bandwidth usage?", "text": "We started using CloudFlare at the end of May 2011 and I was just looking at the numbers and saw our bandwidth usage went from 2-3k GB per month to 18k+ GB per month. Also, our unique visitors dropped from 60-80k to 800ish. Could CloudFlare be causing these numbers to change so drastically? We host podcasts ourselves and I am wondering if CloudFlare is trying to cache them multiple times, causing all this bandwidth usage or something. Usage http://tinypic.com/r/jhvaq1/5"} {"_id": "24886", "title": "Will dofollow backlinks in a syndicated article improve backlinked pages' rankings?", "text": "Do dofollow backlinks from duplicate articles across different sites improve site rankings? I guess Google etc. may well discount the duplicate articles and the backlinks in them completely. Shouldn't this senseless duplication of content be penalized?"} {"_id": "52182", "title": "Share Google Analytics data excluding Google Adsense data", "text": "I have a Google Analytics profile linked with my Google Adsense profile. I want to share the Google Analytics data with some other users (I believe I'm permitted to do that by Google's ToS), but I want to exclude Google Adsense data (both because they have no need to know, and because I'm not sure it is permitted by Google's ToS). Is there any way to selectively set permissions on Google Analytics data when adding users to a Google Analytics account? E.g., can I add somebody with the capability to see Audience and Traffic Sources views, but not Content / Adsense views? I found a semi-relevant link at Google, but this seems to remove it from the whole view, and also doesn't work as I cannot find the Data Sources. 
https://support.google.com/adsense/answer/98145 I don't want to flat-out remove the Adsense integration, as that means I as an administrator lose this data myself (and possibly it is still visible in the view I'm sharing, not sure). Any advice?"} {"_id": "54393", "title": "How to check if a WordPress SEO plugin is working", "text": "I have set up my WordPress site with Yoast. However, it doesn't seem to appear in any of the search engines. In Google Webmaster Tools I don't see a sitemap. Is the plugin supposed to submit the sitemap, and should it appear in Google Webmaster Tools? How do I know if it has been submitted to other search engines?"} {"_id": "49564", "title": "Does Google penalize Wikipedia external links?", "text": "Wikipedia external links are all `nofollow`, so it seems they shouldn't have any effect, negative or positive. But an article that I found online suggested that Google may penalize those external links. According to that article, because Wikipedia has many mirror sites, Google will see many links to your site with the same anchor text, and Google may consider it spam. Also see this forum thread."} {"_id": "54366", "title": "Webserver insists on opening \"blog1.php\" instead of \"index.php\"", "text": "I'm at my wits' end. I have just ripped out a website and am in the process of rebuilding everything. Previously, the 'home page' of the website was a blog, with the address \"www.mydomain.com/blog1.php\". After exporting everything, I deleted the whole directory, and -- based on a request -- immediately created a `blog/` directory. The idea is to get the blog back up as soon as possible, and temporarily redirect people accessing `www.mydomain.com` to the blog. Accessing the blog via `http://www.mydomain.com/blog/` works. So I put in an index.php file containing a (temporary) redirect to the blog's address. The problem: The server _insists_ on opening `blog1.php` instead of `index.php`. Even after we deleted all the files (including `.htaccess`). 
And even putting in a new `.htaccess` file with the single line `DirectoryIndex index.php` doesn't work. The server stubbornly wants `blog1.php`. Now, the server is actually a webhosting service, so I have no actual access to it. I have to do my work via cPanel. Currently, I work around this issue by creating `blog1.php`; but I really want to know why the server does not revert to opening `index.php`. Did I perhaps miss some important settings in the byzantine cPanel menu page?"} {"_id": "21616", "title": "Need to track the website or link from which users land on our web pages - JS or PHP coding", "text": "We have three fair-traffic websites (for three different brand products) and all are serving our internet marketing needs well. Recently, we planned to promote each website on the other two websites to increase traffic to each of them, thereby reducing the Google Ads cost of promoting the websites. Our plan is working well; the only thing that worries us is tracking the traffic source. Both Google AdWords (CPC) and our ads on the other websites land on the same landing page (lead generation page), and hence we find it difficult to analyse the traffic. The big problem is that our management wants to access this information in readable form. Of course, Google Analytics will serve this information, but for that we need to log in to the account and then take a report or do some analysis. What we are looking for is something like IP capturing. So if any person comes from a website page (e.g. www.xyz.com/read.html) by clicking our ad on that page and lands on our website, we have to capture the link or website from which the user landed on our website."} {"_id": "21612", "title": "PHP script to send notification email via hosted ISP", "text": "I have PHP 5 running on a hosted ISP. The ISP gives me minimal configuration options via cPanel. Is there any way to use email notification to respond to some given event? I want to send from PHP in the background. 
Must I specify environment stuff like POP or SMTP? My hosted ISP offers email addresses, of course -- does using one of those make things easier? Now that you know how ignorant I am, do you have any general advice on approaching this?"} {"_id": "60099", "title": "Use p tag inside a div tag for text?", "text": "What is better in terms of SEO?

<div class="box1"><p>A bunch of text right here</p></div>
or
<div class="box1">A bunch of text right here</div>
The box1 div has a lot of CSS applied, so in order to make the text look how/appear where I want, I will wrap it in that div. This div in particular only contains text. But for SEO purposes, is it better to use the p tag inside the div or just use the text directly in the div? The result is the same (visually). PS: I understand that the general idea is that p is more relevant for SEO than div (or at least this is what I heard; please correct me if I am wrong here). But in this case my text on this page will be shown in different containers on the screen for a better look. Should I use the text inside the div, or should I use the text inside the p tag and the p tag inside the div for better SEO?"} {"_id": "45118", "title": "Where to leave feedback about a bad webhost. Way to combat data loss", "text": "I've been with a certain company (50webs) for a while now. I didn't mind the downtime so much, but a couple of days ago they did the unthinkable: they went down for a while and, I'm guessing, restored from a week-old backup without telling any of their customers. I'm a paid customer and my sites run by storing data in *.dat files in the /www/ directory. As you can imagine, I lost a week of data and, worst of all, an update I finished right before the crash. I couldn't get in touch with their support staff (the form throws errors and the phone # is disconnected). That's the background. Now on to the question: I want to leave feedback about the place but have no idea what's a reputable review source these days. And the less off-topic question: What's the recommended way of backing up sites like mine that rely on .dat files for data storage? I can think of a bunch of manual ways, but is there something automatic that would, for example, sync across a couple of FTPs regularly, or say do a daily dump to Google Drive? 
Thank you guys."} {"_id": "19397", "title": "Where to place privacy, legal, etc info on a website?", "text": "Every website I've ever been to has links to legal, privacy, TOS, etc. at the bottom of every page. Is this required? Can I have all that info on a specific about page rather than on every page?"} {"_id": "26027", "title": "Hosting terminology explained", "text": "I am learning the Play framework, and while looking for a place to host (I've never hosted a site before) I find myself lost in this world. I do not understand the terminology used for pricing. I have visited several sites: * http://www.cloudbees.com/pricing-standard-services.cb * http://www.heroku.com/pricing#1-1 * http://aws.amazon.com/en/ec2/#pricing * http://www.playapps.net/pricing All have various tables with hard-to-understand pricing elements (some of which you need to set) such as * Build Usage (Builds per day, Average build duration, Working days per month..) * Dynos? worker dynos, web dynos? I am just looking to host a small website I developed (currently for learning purposes, with an option to maybe grow). I don't know if the listed hosting options above are actually good for me; can anyone explain the basics, or at least point me in the right direction? (Also, can I use any site that supports Java hosting, such as `http://javaprovider.net/`, for running a Play app?)"} {"_id": "26026", "title": "How do I access PHPMyAdmin after install in Ubuntu?", "text": "I installed Apache and then PHPMyAdmin on my Ubuntu server, but I have not been able to access PHPMyAdmin at `http://localhost/phpmyadmin/`. `http://localhost` is working, however."} {"_id": "64900", "title": "If the page inside an iframe has a noindex, nofollow tag, would it affect the rankings of the page containing the iframe? Is it cloaking?", "text": "We have a website with video content delivered from a video studio and hosted on their site, since we don't want to load our server with it. 
The content we are embedding is paid for, and we don't want to rank the videos (since they are hosted elsewhere and we embed them) but to rank our posts with the videos in them. The pages on the website don't contain just the embedded videos; each also has a brief review of the video inside it. Explanation of the concept and construction: Site **_example.com_** has an iframe: Inside the file **_steve.html_**, which is under the same domain example.com, there are the meta tags: And in the body there is another iframe with the video embed code from the external website. Now my questions are: _Is this considered cloaking?_ _Would we be better off without the nofollow/noindex, as now Google can't see the iframed content, and where the video iframe is, in Google's eyes there appears a big nothing?_ Any suggestions? Edit to clarify it a bit here: We have a post Example.com/review. Inside it there is an iframe (example.com/iframespace with nofollow/noindex); inside this iframe (example.com/iframespace) there is another iframe with the video embed code. Would the iframe with the nofollow/noindex affect the post example.com/review which contains the iframe?"} {"_id": "24579", "title": "What database to use for huge servers?", "text": "Suppose I had a HUGE website (which I don't, but let's suppose); the common MySQL on a single server wouldn't work, because the MySQL server wouldn't be able to handle thousands of requests per second fast enough. How can you have several database servers that link to the same data but work independently of each other? **What software options are available?**"} {"_id": "45113", "title": "Hosting multiple low traffic websites on ec2", "text": "We have like 30 websites with almost no traffic (<~10 visits / day) which are currently hosted on a dedicated server. We are evaluating hosting on Amazon EC2; however, I'm not sure how to do that properly. 
* One (micro) instance per website is too expensive * ~10 websites on one instance (using Apache virtual hosts) makes auto scaling impossible (or at least difficult) Or is cloud computing not suitable for such a use case?"} {"_id": "45114", "title": "Do other search engines support Google's \"hash bang\" syntax for crawling AJAX applications?", "text": "Google has a really nice document explaining how web developers can get Google to crawl non-AJAX versions of their web applications to make it possible for Google to index AJAX-heavy websites. Do any other search engines support this standard? I'm specifically interested in the fragment `meta` tag method, not just the `#!` URL method."} {"_id": "45443", "title": "How safe are the new HTML5 input tags?", "text": "I am thinking of using the new input type tags in my web application. I'm just curious how safe it is to use them. Are there any options that let the user disable them in the browser?"} {"_id": "64908", "title": "Does the name server reported by whois get automatically updated when moving hosts?", "text": "This is regarding the nameserver info shown on who.is, example: name server: ns1.NS-example.net When a domain is transferred to another host provider, is it possible for whois to show a nameserver from a previous host? Does this info automatically get updated via the new server settings?"} {"_id": "403", "title": "How should I structure my URLs for both SEO and localization?", "text": "When I set up a site in multiple languages, how should I set up my URLs for search engines and usability? Let's say my site is `www.example.com`, and I'm translating into French and Spanish. What is best for usability and SEO? 
**Directory option:** http://www.example.com/sample.html http://www.example.com/fr/sample.html http://www.example.com/es/sample.html **Subdomain option:** http://www.example.com/sample.html http://fr.example.com/sample.html http://es.example.com/sample.html **Filename option:** http://www.example.com/sample.html http://www.example.com/sample.fr.html http://www.example.com/sample.es.html **Accept-Language header:** Or should I simply parse the `Accept-Language` header and generate content server-side to suit that header? Is there another way to do this? If the different language versions don't have different URLs, what do I do about the search engines? * * * _**UPDATE 2011-12-06_** _Google has new recommendations for `meta` tags for explicitly pointing to other language content: New markup for multilingual content._ _**UPDATE 2012-05-25_** Related but not precisely: Multilingual and multinational site annotations in Sitemaps _**UPDATE 2013-06-12_** Targeting site content to a specific country includes discussion of several URL schemes directly relevant to the question."} {"_id": "34185", "title": "Multilingual mobile site and Google SEO", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? What's the preferred SEO approach for a mobile website that is multilingual? I have - web: en: http://mysite.com fr: http://fr.mysite.com es: http://es.mysite.com mobi: http://m.mysite.com Should I use http://m.fr.mysite.com for my mobile French version? Nothing is specified on the Google blog for mobile: http://googlewebmastercentral.blogspot.co.uk/2011/12/new-markup-for-multilingual-content.html"} {"_id": "53771", "title": "Do I have duplicate content issues to be worried about?", "text": "We built a website targeting different regions but with the same English language (Ireland, England and America). The content for the most part is the same, set up on 3 different subfolders. 
`http://www.example.com/` - targeting the United States in WMT `http://www.example.com/ie` - targeting Ireland in WMT `http://www.example.com/uk` - targeting the UK in WMT Do I have duplicate content issues to be worried about? If so, how do I get around this issue? Also, is there any way of finding out if Google has in some way penalised these pages for having the same content as other pages targeting different countries? I have not received any messages from Google in WMT saying there is duplicate content, so I'm not sure if this is an issue."} {"_id": "27723", "title": "Are there advantages of using hard coded URLs for localization?", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? On the Synergy website, localization is detected (and can be overridden) but uses the same URL for all languages. Some websites, however, like Wikipedia, have language-specific subdomains. What are the advantages of having either subdomains or subdirectories (i.e. a specific URL) for each language localization? Also, should it automatically redirect the user to the specific subdomain/subdir based on the language that the browser requests? I suspect that there are advantages, which I'm guessing are: 1. When the website appears in search results for non-English languages, the translated page description will be shown (assuming there is a translation provided by the website). 2. When a user shares a page (e.g. through Twitter), it will show in a specific language. Perhaps this is a disadvantage though? Am I correct? If so, are there more advantages?"} {"_id": "58797", "title": "SEO & PageRank optimization when using translated content across three different TLDs", "text": "We have a website **example.gr** translated into three languages (Greek, English & German). We have also registered two more TLDs, **example.com** & **example.de** So far, based on what the user types in his (or her) browser, we have... 
example.gr - shows the Greek content example.com - shows the English content example.de - shows the German content The content in English & German is a translation of the Greek content. The company is starting to expand worldwide, so I am thinking of redirecting (301 Permanent Redirect) all domains... example.gr, example.de (with or without www) and example.com to **www.example.com** Is that a good practice (for SEO and PageRank)? Or would it be better to redirect example.gr (with or without www) to www.example.com/?lang=gr example.de (with or without www) to www.example.com/?lang=de example.com to www.example.com (the English content) Please advise."} {"_id": "56702", "title": "Country-specific website URL for SEO", "text": "I am developing a new site and have two domains. One is the `.com` domain and the other is the country-specific `.in` domain. The market for my products is both in India and globally outside of India. Is it in any way beneficial from an SEO perspective to have a country-specific URL, or does it not matter?"} {"_id": "55629", "title": "What is the best strategy for multilingual websites", "text": "I've been asked to create a site which targets multiple languages. The client is multilingual themselves, so coming up with the content in different languages is no problem. My aim is to get the best SEO in all languages but at the same time not split the SEO equity across multiple domains (.co.uk, .de etc...). 
Is there a viable way I can create a multilingual site under one (domain) roof, and if not, what other strategies are there?"} {"_id": "63111", "title": "Different content, same URL: bad for SEO?", "text": "I want to translate the page based on the Accept-Language header. Do I need to redirect to /$lang, or is having different content (actually the same content, but translated) at the same URL OK?"} {"_id": "55352", "title": "How do I avoid dupe content with different countries+locales?", "text": "I'm considering hosting different translations of my web site via a subdirectory for each; for example: http://www.mydomain.com/en-US/ http://www.mydomain.com/en-CA/ http://www.mydomain.com/fr-CA/ http://www.mydomain.com/fr-FR/ (or I will use subdomains instead of subdirectories). The en-US and en-CA pages may have very similar content, and I am worried Google will see this as dupe content. There may be multiple paragraphs with exactly the same text, and maybe a heading will have one different word in it... so both pages may be 90% - 95% the same. Will this be a problem? How can I avoid my site's ranking being affected for dupe content?"} {"_id": "33068", "title": "What is better with 3 languages: domain-lang.org x 3, domain.org/lang or lang.domain.org?", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? I have three domains for similar content in three languages. One in each. I have now been told that the languages are better off as subdomains or even as sections. My question is, thus, what is best from an SEO point of view: * domain-**lang**.com/ * **lang**.domain.com/ * domain.com/**lang**/"} {"_id": "1833", "title": "What's the best way to split up the multilingual parts of my site?", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? I want to have my site available in several different languages. 
I've seen some sites that use country-code subdomains (jp.blah.com), some that include the language in the URL (blah.com/jp/questions), and some that don't even include the language in the URL and just use the browser's locale setting or cookies to determine the language to be displayed. What are the pros and cons of each approach? Are there any techniques that I missed?"} {"_id": "28895", "title": "Should I have a separate URL for each language and each page?", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? Please be specific. I already plan to change the language based on the Accept-Language header, and then any user-specific overrides (in a cookie). Basically: should I have example.com/es and example.com/cn, or just example.com with different content? Situations to consider: I link you to an English-language version of example.com, but you are a native Chinese reader. Should you go to example.com and see Chinese? English? Or be redirected to example.com/cn? Do Google and Bing (and Baidu) crawl multiple times with different Accept-Language headers? I'd guess not, but I'd like references. Even if they did, having separate URIs would probably get you crawled quicker, because you could put each one in your sitemap. What would I do in either case if I don't have some given content translated? Like a blog post that is English-only on the day it is published. Any thoughts are appreciated. Thanks"} {"_id": "30339", "title": "Moving one site in Webmaster Tools to more than one site", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? I have a Question and Answer site about immigration. Now I have divided it into 2 sites: **mysite.co.uk** about immigration to the UK **mysite.com** with subdomains for every country, like: _australia.mysite.com_ , _sweden.mysite.com_ , ... Now I have moved all the content from my first site into the .co.uk and .com sites and their subdomains to fill them. 
I know that Google will detect my 2 new sites as duplicates of the first one, and that is very bad for SEO, and I don't think Google Webmaster Tools has a tool for this. So please guide me on how to fix this problem."} {"_id": "55821", "title": "SEO - Language on a static site", "text": "We have been tasked with re-building a WordPress site for a Spanish hotel - it's currently a WordPress site - but they want it re-designed as an attractive static HTML site - all pretty straightforward - other than the fact that it currently has a language switcher (English to Spanish and vice versa). Being a static site, I felt the easiest way to approach the change of language was to duplicate the site and add text to a Spanish version of the site in a /spane folder - and simply add links to the folder and to the root depending on language preference. My question is - would this approach have any negative SEO implications (having dual content)? Can anyone recommend a better solution if so?"} {"_id": "28399", "title": "Multiple country-specific domains or one global domain", "text": "> **Possible Duplicate:** > How should I structure my URLs for both SEO and localization? My company currently has its main (English) site on a .com domain with a .co.uk alias. In addition, we have separate sites for certain other countries - these are also hosted in the UK but are distinct sites with country-specific domain names (.de, .fr, .se, .es), and the sites have differing amounts of distinct but overlapping content. For example, the .es site is entirely in Spanish and has a page for every section of the UK site but little else, whereas the .de site has much more content (but still less than the UK site), in German, and geared towards our business focus in that country. 
The main point is that the content on the additional sites is a subset of the UK site's, is translated into the local language, and although it is sometimes simply a translated version of UK content, it is usually 'tweaked' for the local market, and in certain areas contains unique content. The other sites get a fraction of the traffic of the UK site. This is perfectly understandable since the biggest chunk of work comes from the UK, and we've been established here for over 30 years. However, we want to build up our overseas business, and part of that is building up our websites to support this. **The Question:** I posed a suggestion to the business that we might consider consolidating all our websites onto the .com domain but with /en/de/fr/se/etc sections, as plenty of other companies seem to do. The theory was that the non-English sites would benefit from the greater reputation of the parent .com domain, and that all the content would be mutually supporting - my fear is that the child domains are too small to compete on their own against competitors who are established in these countries. Speaking to an SEO consultant from my hosting company, he feels that this move _would_ have some benefit (for the reasons mentioned), but that it would likely be significantly outweighed by the loss of the benefits of localised domains. Specifically, he said that since the Panda update, and particularly the two sets of changes this year, we would lose more than we would gain. Having done some Panda research since, I've had my eyes opened on many issues, but curiously I haven't come across much that mentions localised domain names, though I do question whether Google would see it as duplicated content. It's not that I disagree with the consultant, I just want to know more before I make recommendations to my company. _What is the prevailing opinion in this case? Would I gain anything from consolidating country-specific content onto one domain? 
Would Google see this as duplicate content? Would there be an even greater penalty from the loss of country-specific domains? And is there anything else I can do to help support the smaller, country-specific domains?_"} {"_id": "28322", "title": "Delay before getting URLs into Google's web index?", "text": "4 days ago, I submitted a `sitemap.xml` to Google Webmaster Tools for a new website. It contains 3 entries. I have tested the sitemap and Google found no errors. In about how much time can I expect URL entries to be added to Google's web index? It is still 0 for now."} {"_id": "68801", "title": "How come a site is indexed but not indexed?", "text": "An example URL: islamqa.info/ar The site is indexed in Google Webmaster Tools as per this image ![enter image description here](http://i.stack.imgur.com/wDdTo.jpg) But when I use Google's site: operator, I get Your search - site:islamqa.info/ar - did not match any documents. Hence, the site is indexed but not indexed! Note 1: All other languages, e.g. islamqa.info/en, are OK! Just the Arabic language has this problem. Note 2: Other search engines like Bing and DuckDuckGo show correct results!"} {"_id": "51263", "title": "Would this type of rich anchor text hurt my SEO?", "text": "I have a site structure and it kind of looks like this: Home Fruit Flavors Apple Flavors Banana Flavors Pear Flavors Plum Flavors Grape Flavors Vegetable Flavors Cucumber Flavors Lettuce Flavors Olive Flavors Carrot Flavors Cabbage Flavors On the pages **Fruit Flavors** and **Vegetable Flavors**, I have a list of links leading to all of the flavor pages in that category. For example: The page **Fruit Flavors** has links in a list to the following pages: Apple, Banana, Pear, Plum, and Grape Flavors pages. The page **Vegetable Flavors** has links in a list to the following pages: Cucumber, Lettuce, Olive, Carrot, and Cabbage Flavors pages. 
However, the anchor text in all of the links is what you could consider rich anchor text. For example, if I want to rank for the phrase \"Apple Flavors,\" I link to the page **Apple Flavors** with the anchor text \"Apple Flavors\" from the **Fruit Flavors** page. This pattern repeats for all of the pages under the **Fruit Flavors** and **Vegetable Flavors** categories. Would this be bad for SEO? From what I've read it seems like it is, so what alternatives should I take?"} {"_id": "44793", "title": "SEO optimization", "text": "I have submitted several pages to Google through Google Webmaster Tools by using a sitemap in XML format. Several pages are indexed by Google as well, but those pages rank around 400-500, as my links look like this: http://xyz.com/test/music/view/1333 I have noticed Google gives more importance to pages whose links contain the title, something like this: http://xyz.com/test/music/view/1333/summer_of_69_music So if Google finds the title or search query in the link, it gives more importance to those links. I have therefore regenerated my sitemap with the title in all links and submitted it to Google, but as all the links are already indexed and the revised sitemap contains the same titles, Google is not considering the new sitemap for indexing. Please tell me: should I delete my old sitemap, since its links lack the titles and my rank is very poor? If Google indexes my new sitemap, I hope to get more search traffic. I think Google will not index duplicate pages from the same website, so I don't have any option other than deleting the old sitemap. Please advise."} {"_id": "44155", "title": "How do I map a (keyword-rich) domain name to an existing website?", "text": "I am not an experienced technical person, and am still learning, but I will try to explain what I have done so far and what my query is. 
I have a (hypothetical) domain az-studios.com On that domain I have 3 subdomains: **london.az-studios.com** **newyork.az-studios.com** **paris.az-studios.com** Each of them has a 301 header redirection as follows: **london.az-studios.com** -> **www.az-studios.com/london** **newyork.az-studios.com** -> **www.az-studios.com/newyork** **paris.az-studios.com** -> **www.az-studios.com/paris** So I can maintain only one unique HTML document (that appears to be three different paths). I have set up .htaccess to use MOD_REWRITE as follows: **www.az-studios.com/london** -> **www.az-studios.com?city=london** **www.az-studios.com/newyork** -> **www.az-studios.com?city=newyork** **www.az-studios.com/paris** -> **www.az-studios.com?city=paris** This is the existing structure so far. I have recently purchased three (hypothetical) keyword-rich domains: **movie-studio-london.com** **movie-studio-newyork.com** **movie-studio-paris.com** What I would like to achieve is to have these three domains pointing as follows: **www.movie-studio-london.com** -> **www.az-studios.com?city=london** **www.movie-studio-newyork.com** -> **www.az-studios.com?city=newyork** **www.movie-studio-paris.com** -> **www.az-studios.com?city=paris** The only tricky thing I can't figure out is how to do that so that, from a Google SEO point of view, it does not use 301 redirects or frames. I would like **www.movie-studio-london.com** to appear to visitors (and especially Google bots) as a standard website (with no funny JavaScript, links, 301 redirects, frames, etc.). Some of you might scream \"duplicate content\", but the websites, although using the same index.php, are _very_ different. I am also aware that this could be seen as doorway pages, but these newly purchased domains really define (with keywords) my products and what the different websites are about. Any idea? Any more details, please ask... 
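For clarity, the internal rewrite I described above looks roughly like this (a simplified sketch; the city paths and the single index.php are the hypothetical ones above):

```apache
# .htaccess sketch: map the pretty paths to the query-string form internally
# (no external redirect, so the visitor's URL stays as /london etc.)
RewriteEngine On
RewriteRule ^(london|newyork|paris)/?$ index.php?city=$1 [L,QSA]
```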
Thanks Vincent"} {"_id": "63545", "title": "Should a targeted landing page redirect use a 301 or 302 status code?", "text": "I have a landing page that redirects if the HTTP referer is from a known site, using a simple nginx rule: location = /index.html { if ($http_referer ~* (www.)?amazon.com) { return 302 https://mydomain.com/amazon.html; } } Should the redirect use a 301 (permanent) or a 302 (temporary)? I'm thinking 302, since this is a marketing tactic. What is best for SEO? Thanks."} {"_id": "55924", "title": "Can Wikipedia content be used on a custom wiki on a site?", "text": "I am using Wikipedia content (100+ pages) on a site just as 'articles' and changing certain links within articles from linking to the wiki to linking to that source article within that article on 'my' site - would there be any benefit in putting that content into a custom wiki on the site (using the same software as Wikipedia)? What difference, if any, would it make to how the search engines view the site overall, and specifically that content, in the context of a wiki as compared to just a collection of articles? (I add some of my own content as well; it is not all Wikipedia, albeit Wikipedia content is the majority.)"} {"_id": "22874", "title": "Does naming my website like this cause a copyright infringement?", "text": "I want to create a website for a particular industry. There is a term that is used quite widely in the industry - for the sake of argument, let's say the expression is \" **great scott** \". The term is used by a lot of people - and as far as I know, the expression (\"great scott\" in this example) is in the \"public domain\", in that it is used freely by all and sundry, and the term is generally understood without having to explicitly state what the words mean. Having said all that, there is an industry equivalent of a 500 lb gorilla, which has a domain with the domain name **www.greatscott.com**. However, the domain **www.egreatscott.com** is available. 
What is the risk (or indeed legal basis) of a legal suit from the 500 lb gorilla company if I start my website (which is in the same industry) using the domain name **www.egreatscott.com**? Are there any legal precedents for such a lawsuit?"} {"_id": "22875", "title": "Looking for dedicated volunteer-matching website software", "text": "I am building a medical volunteering website for an NFP using WordPress and the Nine to Five theme. It is working (in a test environment) okay, but I'm wondering if anyone has come across dedicated volunteer-matching website software."} {"_id": "42838", "title": "My .htaccess entry is not working", "text": "Below are all my .htaccess entries: RewriteEngine on RewriteCond %{HTTP_HOST} !^www\\.acethehimalaya\\.com$ [NC] RewriteRule (.*) http://www.acethehimalaya.com/$1 [R=301,L] redirect 301 /brochure_request.php http://www.acethehimalaya.com/request-brochures.html redirect 301 /testimonials.php http://www.acethehimalaya.com/testimonials.html RewriteCond %{THE_REQUEST} ^.*/index.php RewriteRule ^(.*)index.php$ http://www.acethehimalaya.com/$1 [R=301,L] But the last entry, for redirecting: http://www.acethehimalaya.com/tripdetails.php?trip_id=8 to http://www.acethehimalaya.com/destinations/nepal/nepal-trekking/everest-base-camp-budget-trek.html is not working. The code I have used is below: RewriteCond %{REQUEST_URI} ^/tripdetails.php$ RewriteCond %{QUERY_STRING} ^trip_id=8$ RewriteRule ^(.*)$ http://www.acethehimalaya.com/destinations/nepal/nepal-trekking/everest-base-camp-budget-trek.html [R=301,L] It's been days and I'm stuck, so can anybody figure this out?"} {"_id": "2", "title": "What are the best ways to increase a site's position in Google?", "text": "What are the best ways to increase a site's position in Google?"} {"_id": "237", "title": "Basic SEO Optimization", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? 
What are the basic things a webmaster can do to make its site SEO-optimized? I know of adding page-relevant keywords and a description in the meta tags... What are the others? Thanks."} {"_id": "28884", "title": "Google is not indexing my pages", "text": "I just typed a post title from a few days ago into Google with \"quotes\" around it and nothing is showing up. Does Google hate me? I am doing nothing but white-hat marketing and even took down the pictures that I ripped from other sites. Does anyone have any suggestions?"} {"_id": "43973", "title": "How to give SEO support for our website?", "text": "I want to ask: how do we give SEO support to our website? How do we get a high page rank in search engines for our website? Please share any tricks or knowledge. Thank you. Regards, Lena"} {"_id": "56674", "title": "Most of my pages weren't indexed", "text": "I've submitted a sitemap (XML) of my Drupal (7) site to Google Webmaster Tools and it says that my sitemap contains 285 pages and that only 87 pages were indexed. There are no errors and no additional info or problems. There are no duplicates. Any idea why most of my pages weren't indexed? When I Google my site (i.e. search the following in Google: `site:www.mysite.com/`), I get ~200 pages (results), which is better, but not perfect."} {"_id": "49377", "title": "Webmaster Tools shows zero pages indexed", "text": "I have a website with over 60 articles (dynamic pages), a couple of profile pages, an about section, terms of use, contacts, pretty much everything. I've added my site to Webmaster Tools and it seems that it is being crawled but not indexed. I don't have a sitemap, but I have other pages which are designed in the same way, yet they seem to be indexed fine. What could be the problem? ![enter image description here](http://i.stack.imgur.com/S5wZb.png) On second thought, it seems that very few pages are crawled. 
_**Could it be that all my pages are on subdomains?_**"} {"_id": "29653", "title": "Promoting certain keywords", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I am not a person with an SEO background, so it's quite possible that my question is either off topic or not a valid one. I have developed a website (TheColorsofMySoul) using WordPress; it's basically for writers, and the portal itself contains a variety of topics. Though I am taking care to provide good titles, descriptions, and keywords for each post being published, I want to promote some specific words in general. E.g. the site is for writers and bloggers, so I want to promote specific keywords like \"online writing community\", so that if someone searches for such words on Google, the search engine can place my site among the top results. I know that it's quite a big task, but I am not sure how to use those words on my site so that I can boost the search result ranking for these words. Any help/resources will really be helpful for me."} {"_id": "23798", "title": "Google is not indexing the entirety of my site", "text": "I require Google to index the entirety of my site: 10 or so basic sections with 80,000+ individual job offer pages. However, to date, only 2,900 pages have been indexed by Google. I'd like to know why... only 2,900 but not 80,000+. I figure, if there were something wrong with my job offer pages, then there wouldn't be those 2,900 values indexed but rather 0... I have a 'page' sitemap which, link by link, leads to each and every ad (paginated). My XML sitemap is lacking though. So does this mean you absolutely need an XML sitemap when you're dealing with tens of thousands of unique pages? http://haytrabajo.mx is the URL http://haytrabajo.mx/mapa is the 'page' sitemap"} {"_id": "67140", "title": "Add my dynamic page URLs to Google results", "text": "Now I am working on Google SEO. 
My website URL is like `http://example.com`, and when I Google it I can get my website results at the top of the page. But my domain has dynamic links like: http://example.com/about/alax http://example.com/about/john http://example.com/about/max http://example.com/about/selva and so on... So when I Google it with a query like `alax sit:example.com` in the Google text box, it shows an empty result. 1. How can I add all my links to Google? 2. What do I need to do to get results from Google?"} {"_id": "24206", "title": "Can anyone recommend some up-to-date documentation on SEO?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I haven't done this for a while, and I know that search engines are cleverer now. Does anyone know of some good tips or websites that tell me how to do SEO properly, preferably using modern standards (not the old \"add 300 tags to the page\" approach)?"} {"_id": "30320", "title": "Homepage not showing on Google", "text": "About six weeks ago my homepage (mayberrykayakingdotcodotuk) disappeared from the Google organic search for \"kayaking pembrokeshire\" despite it having been number 2 within a few weeks of its launch last summer. My previous site (www.mikemayberrykayakingdotcodotuk) had been 2nd for about six years and has 301 redirects for all pages to the new site. The Google toolbar still rates the homepage as 3/10 and the domain is still showing in search results, just not the homepage. A little research suggests that this is most likely due to an issue with Google treating two pages as identical content (one with www and one without) since the changes in their algorithms around that time, and that the way to fix this is to add some code somewhere. This makes sense to me, as my print advertising doesn't have the www part of the address. I have cPanel access but limited knowledge of web coding, having picked things up as I've gone along and paid for designers etc. when needed. 
Would someone be able to let me know where I have to go to add the code, and what code I need to add, to redirect the crawlers to one page? Or is there another issue that is causing this?"} {"_id": "6351", "title": "Why isn't Google crawling my blog?", "text": "I have a blog. I made it in August, 2010. It has been a while, but I am unable to make Google crawl my site. I have requested Google to crawl it, but there is still no response. Are there any notable reasons why this is happening?"} {"_id": "64784", "title": "Why doesn't my website appear in Google search results?", "text": "I've made a simple website for a friend's bakery and it doesn't appear in Google's search results **even if I search for the whole domain name** of the site. What can be the problem? How do I make it appear? (The site)"} {"_id": "26485", "title": "How can I optimize my websites for the search engines?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? How can I optimize these websites for the search engines? http://capablerealestate.com http://ajnaralondonsquare.co.in"} {"_id": "47854", "title": "What are the best ways to generate organic traffic?", "text": "Since the introduction of Google Panda, bloggers and webmasters have been facing great challenges. Duplicate content is the key issue, and spam bot traffic is the second issue for Google Panda. In this critical situation, how does one beat Google Panda, and what are the best sources to drive organic traffic? Is there any pragmatic solution to this problem?"} {"_id": "10281", "title": "SEO tips and tricks for web developers", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I know most of the users of Stack Overflow would have knowledge about **Search Engine Optimization (SEO)**. Any tips and tricks about SEO for your fellow web developers? ... **EDIT:** * Good book about SEO? 
* Good blogs/websites about SEO?"} {"_id": "49394", "title": "Debugging reason for specific site sections with significantly low appearance in SERP", "text": "I'm experiencing very little exposure for specific sections of my site compared to other sections. Most of my traffic is long-tail queries, in both the popular and non-popular SERP sections. I know this is a pretty broad question, but how would you suggest debugging possible reasons for this issue? Specific info: * 'Problematic' pages are linked to from 'popular' pages and the sitemap * Most of the titles of 'problematic' pages are similar to the titles of 'popular' pages, but not exactly the same. * Most of these pages don't have many internal or external links * These pages contain many photos and not a lot of text * Google has indexed about 175k pages (from 1M), but only displays about 10 in the SERPs. * Pages don't include 'noindex' Any insights appreciated."} {"_id": "7587", "title": "What is the Google search engine interested in on HTML pages?", "text": "> **Possible Duplicate:** > What are the best ways to increase a site's position in Google? I know there are meta keywords, HTML structure, and the sitemap, but what else is Google interested in?"} {"_id": "30725", "title": "How do I make the home page of the website come up higher in the rankings than the internal pages?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? Suppose I have a website, e.g. **`www.example.com`**, that comes in at **number 6** in the Google search rankings, but the internal pages of the website, i.e. **`www.example.com/index.php?a=1&b=2`** or **`www.example.com/index.php`**, come in at **number 2** in the rankings. How would I make my prime domain name **www.example.com** come out at the top of the list? Any guidance would be appreciated."} {"_id": "59515", "title": "What more to use for SEO?", "text": "I was wondering how I could optimise my Google results. 
When I type the name of my company it will show up, but I would like it to also show up if I type (design langezwaag) or (webdesign langezwaag), because I'm the only person in the village where I live doing this kind of work. My company isn't registered because I'm only a student. What I'm already using: Google site search (I think this helps some), Google Analytics, a Share This plugin, a news page. Meta and alt: I'm not using them (I read Google doesn't index those anymore). No links page: I heard that the system of links and backlinks isn't useful anymore. Is it possible to be found for a bigger range of search terms, or are there any other ways to improve the chance of being found? --- edit --- I'm also using HTML5 and CSS3, as valid as possible; I think I read Google also checks this. --- edit --- Most of the pages are .php files; I don't know if that makes any difference."} {"_id": "59733", "title": "On-Page optimization of Home Page", "text": "How do you do on-page optimization of a home page (for an industrial company, product development company, service provider, etc.)? The home page of a site, as compared to a blog, is a mother page: a base page that provides a snapshot of all the services the company or professional has to offer, plus other important info like internal page links and contact info."} {"_id": "20407", "title": "Some queries about Blogger SEO", "text": "I have been using Blogger for the past 2 years and am happy with it. Recently I read many articles saying WordPress is far better than Google's Blogger in terms of SEO. That may be true to some extent. But what I think is that if you have good content and SEO optimization, your blog and site will rank higher irrespective of whether you use WordPress or Blogger. **What can I do to improve my Blogger blog's rank?** I have been reading SEO articles for the last 2 weeks and am thinking of implementing some of these things in my next post. I am implementing the following things for better SEO.
I would love it if someone would correct me if I am making any mistakes. 1. Titles are important, and the URL too. Blogger allows 40 characters in the URL; that's why I post with a title that has keywords in it and then re-post with the actual title I want users to see. For example: if my post is about \"Best outfits\", I will post with the title \"worlds best women men outfits clothing\", which has keywords, and Blogger will create a URL like `www.xyz.blogspot.com/06/11/worlds-latest-best-women-men-outfits`. Then I can re-post it with the new, user-readable title \"World's Best Outfits For Men & Women, Latest Trends in Outfits\". 2. The first 200 words of the post are important, so I try to use the relevant important keywords at the beginning of the post. But keyword stuffing can get my blog de-indexed from Google, so I try to make sentences from the keywords. For example, if I am writing about the iPhone, I would use the following line to start my post: _\"Article about iPhone, iPhone's release date, its features, functions & price. Is the Galaxy S2 better than the iPhone 4?\"_ I am not sure if this is right or wrong ethically. 3. I try to give links to my other posts at the start or end of my current post. I have read somewhere that links containing keywords are weighted higher in search engines, and that search engines give more importance to the start and end of the post. 4. I try using keywords throughout the post whenever possible, around 7-8 times. I take care that I'm not stuffing keywords unnecessarily. 5. I use 4-5 labels (tags) for every post containing keywords, and I show labels just before and after the post. 6. I keep at least 10 posts on my home page, plus a label cloud and blog archive. 7. SEO experts say to use `H2` tags for important keywords & titles. I am not sure, but does it really help?
I try to put titles in `H2` or `H3` tags, and if I am using titles inside the post I try to use `H2` tags for them."} {"_id": "53591", "title": "Bing crawls my sitemap, but pages don't show in results", "text": "Before voting my question down, I must note that I have searched the forum. That being said: I have a sitemap, which is pretty large (25k pages). These are products for sale; the products are used auto parts. I'm helping the web site \"ahparts dot kom\". So .com/sitemap.xml (the big one) is submitted to Bing's webmaster tools, which says it has been crawled just fine, but I can never find any of the pages in the search results - over a year of trying! I thought the big sitemap might be the issue, so I made a second sitemap, \"sitemap2.xml\", which only contains 2.5k pages. Same story with that one. I might mention that Google has crawled and indexed all of the pages on the sitemap just fine! Anyone have any ideas as to what I can try? I might also mention that the URLs are all without variables such as ?id=58499 or anything like that; they are rewritten, cleaned-up URLs. Thanks, Dav"} {"_id": "25950", "title": "How to increase visitors to my website per day", "text": "> **Possible Duplicate:** > How can I increase the traffic to my site? My website is fully implemented with `SEO` and registered in all the webmaster tools, like those of _Google, Bing, Yahoo_ etc. As a result, **500 pages** of my website are listed in Google, and lots of images as well. There are **Google sitelinks** available for my website. Still, my website does not have more than 100 visitors per day (per Google Analytics). So what can I do to increase visitors to my website?"} {"_id": "55804", "title": "SEO tips for blog", "text": "I've got a Blogger blog; how can I grow traffic from search engines? And what optimizations can I do for the blog?"} {"_id": "13081", "title": "How to increase site ranking?
for a keyword: Sushil Handa", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I have a website located at http://www.fifthvedaentrepreneurs.com. I am trying to get this website ranked as high as possible on Google for the keyword \"Sushil Handa\" (promoter of the company). How do I do that? I am using various link building strategies but have not been very successful so far. Please share your ideas. Thanks in advance for your kind co-operation. Sincere Regards, Sushil Handa, CEO - The Fifthveda Entrepreneurs"} {"_id": "58603", "title": "Why is Google Webmaster Tools showing index status 0?", "text": "I've added my site, kanosa.com, to Google Webmaster Tools. The site was submitted a few months ago. The index status shows 0, but the sitemaps section shows that multiple URLs have been indexed. If I search in Google I can see that the site is indexed and several pages appear:"} {"_id": "61792", "title": "What replaces meta keywords if Google doesn't use them anymore?", "text": "As a majority of webmasters already know, Google no longer uses meta keywords to help index a website, which is understandable given that people exploited the system to try to gain a better ranking for keywords they don't actually use on their sites. My current thoughts on this question are: * Writing _good_ content * The meta description should still work * Allowing the site to be easily indexed thanks to HTML markup * Google Webmaster Tools gives insight * Google Analytics tells me where my users are going and why... ? * HTML breadcrumbs What do we replace the keywords with to help Google index our sites accordingly and more accurately?"} {"_id": "47309", "title": "Index pages that can only be found with search", "text": "I have a website that has 5 static pages and more than 700 dynamic pages that can only be accessed by search. I created a `sitemap.xml` that includes the 5 static pages and all of the dynamic content.
But Google Webmaster Tools shows that Google has only indexed the static pages. What must I do to get all pages indexed? Google Webmaster Tools says that 7xx pages were submitted and 5 pages are indexed"} {"_id": "53102", "title": "A new WP website and Google has indexed 0 pages?", "text": "I haven't seen this before, but Google has indexed 0 pages of a WP-based website I am working on. It's a simple website; the content is handled by the client, and all the technical stuff is up to me. It's been a week or more, but I can't get the website indexed, and I don't see any error/warning messages in Google Webmaster Tools. Things I have done: - sitemap - connected Analytics. Any tips or suggestions?"} {"_id": "33544", "title": "SEO Team in a fix", "text": "> **Possible Duplicate:** > What are the best ways to increase a site's position in Google? I am having an educational website developed by dedicated developers, and I have an SEO team as well. We have been working on the SEO and SEM for the past six months and were able to get our keywords ranked on the top pages. But since the latest Panda update by Google, all our keywords went down. We researched a lot to get around it but couldn't find the right direction. None of the blogs gave any definitive suggestions. Our SEO team is in a complete fix. Can anyone please shed some light on this?"} {"_id": "58594", "title": "SEO doubts regarding key phrases and autosuggestion", "text": "I know on-page optimization techniques. What is the best and proven way of off-page optimization for ranking a key phrase? Is it: 1. to build as many backlinks to your site as possible? 2. or build as many backlinks with the \"key phrase\" as the anchor text of the link? 3. or some other technique? One more doubt I have: how do you make your key phrase appear in Google autosuggest? For example: my company's name is \"amazing bamboo shop limited\". When I search in Google it suggests only up to \"amazing bamboo shop\".
So how can I add this \"limited\" to that autosuggestion?"} {"_id": "47142", "title": "Googlebot ignores my new site", "text": "I have the following situation: I had a blog (made using Joomla) that over time became pretty well indexed by Google. Due to a technical problem I deleted it and created a new one using WordPress (so it is a different site; only the URL is the same), and I manually reinserted all the articles. Looking at my statistics I can see that there are many visitors who try to visit the old pages (which don't exist in the new site, which has a different structure). The problems are the following: it seems to me that Googlebot tries to visit only the old pages (which don't exist) but ignores the new pages of the new site (and consequently doesn't index them). Some days ago, I found that I had enabled the setting to discourage search engines from indexing the site, so I changed it, and now, in theory, Google should visit these pages and index them... Why is this not happening? How can I solve it? **REQUIRED INFORMATION:** The URL of my site is http://www.scorejava.com/ and this is my **robots.txt** content: # If the Joomla site is installed within a folder such as at # e.g. www.example.com/joomla/ the robots.txt file MUST be # moved to the site root at e.g. www.example.com/robots.txt # AND the joomla folder name MUST be prefixed to the disallowed # path, e.g.
the Disallow rule for the /administrator/ folder # MUST be changed to read Disallow: /joomla/administrator/ # # For more information about the robots.txt standard, see: # http://www.robotstxt.org/orig.html # # For syntax checking, see: # http://www.sxw.org.uk/computing/robots/check.html User-agent: * Disallow: /administrator/ Disallow: /cache/ Disallow: /cli/ Disallow: /components/ Disallow: /images/ Disallow: /includes/ Disallow: /installation/ Disallow: /language/ Disallow: /libraries/ Disallow: /logs/ Disallow: /media/ Disallow: /modules/ Disallow: /plugins/ Disallow: /templates/ Disallow: /tmp/ The strange thing is that the comments in robots.txt reference Joomla even though the site is now WordPress... maybe it is still using the old robots.txt file of the previous website (which was in the same space)?"} {"_id": "33344", "title": "Companies that provide SEO solutions?", "text": "> **Possible Duplicate:** > What are the best ways to increase a site's position in Google? I've seen a lot of those companies around, saying that they could improve my website's ranking in search engine results. The thing is, they say you'll be on the first or second page, guaranteed. I tried to contact a few of them, and they charge a nice amount of money for it; the cheapest I found was about $5,000. Now, I didn't believe that they could make a huge change, but I did see that a few of their customers got to the first pages fast, leaving bigger and more experienced companies behind. I'm quite sure I would like to hear about their way of doing that, and maybe even do it myself. So do you have any idea what those companies do to improve rankings?"} {"_id": "54522", "title": "Wordpress Website issue", "text": "I have my website in WordPress. Now the problem is, if I search in Google for any keywords related to the website's webpages, it doesn't show any of them in the web results. But if I search Google's blog results, it shows my webpages there.
I want to know what the problem is with my webpages. Why are they appearing in Google blog search instead of Google web search?"} {"_id": "55612", "title": "Getting a key phrase into the Google index", "text": "A few weeks ago I generated some content on my site. Now I am getting many visits from Google, but those visits are for phrases that were in the content I added. The most important key phrase was not in this content, but I added it to every page title possible. Still, this phrase is not appearing on Google. So my question is: how do I get Google to index my site for a specific key phrase so that my site will appear on the first page? That phrase does not have many competitors; there are about 2-3 sites that are kind of big, but the rest are just as small and unpopular as my page is."} {"_id": "53354", "title": "Increase Website Traffic", "text": "I made a website for online shopping deals and offer codes. For the last 5 to 6 months I have updated it every day, but traffic has not increased. I request all SEO experts: please suggest how to increase my website traffic."} {"_id": "29530", "title": "Google not indexing home page", "text": "I have two sites that I'm trying to generate traffic with, and for whatever reason, Google does not seem to want to index the home pages. Other pages on the sites are indexing just fine. I need to get these home pages indexed, obviously, or perhaps reassess strategy if it's not going to happen. Has anyone seen this before, or can you see something wrong with my home pages from Google's perspective?"} {"_id": "30798", "title": "Google Index problem", "text": "I submitted my site to Google, but Google is not indexing it. I do not understand why; my website has many backlinks from other sites. What can I do?"} {"_id": "30892", "title": "Increasing traffic for music blog", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google?
I own a music blog with some articles and of course YouTube iframes and mp3 listening plugins. I get traffic from Google and from one not-very-popular site (I won't mention it because you might think it's spam) where I post links with pictures to my posts. Any ideas to get more traffic for this kind of blog? I know about Myspace, blog directories, video sharing sites, same-niche blogs and relevant forums. Anything I'm missing? Thanks in advance"} {"_id": "52102", "title": "Show my blog content in Google search", "text": "Recently, I created a blog on Blogger.com (or blogspot.com). My problem is that I can't see my content when I search Google for it. Similarly, I created the same page using Wordpress.com, which does show the content when searched on Google. What should I do to show my content in Google search using Blogger? **EDIT**: ( _Edited after this **post**_ ) More specific question - what part of a Blogger blog does Google search? I get the impression that it searches the header and the posts, but not the widgets, so if you are using the blog more as a website, your widget content won't get searched. I may be wrong though. Does anyone know exactly what is searched: heading, subheading, widgets and posts? Please help, I am new to blogspot.com (blogger.com)"} {"_id": "53206", "title": "Is it normal to not be indexed in Google after a month?", "text": "I started 2 new sites at pretty much the same time. They are both small blogs, running the exact same blogging software (even the same template). The content is different, of course. One of them was indexed in Google 3 days after launch. The other one has not been indexed at all. It gets visited by Googlebot 1-2 times a week, but it won't show up in search results, even if I do a \"site:www.example.com\" search. Is this normal?"} {"_id": "16885", "title": "Factors for websites relying solely on organic searches", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google?
What factors should we consider when building a Q&A website that relies solely on organic searches? Note: I am gathering all the information I can about starting a website driven solely by organic searches. **Even a minute detail here will help me a lot.** Please enlighten me. Thanks :)"} {"_id": "3486", "title": "My site does not appear in the first 30 or more Google pages. What could be the reasons?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I made this site about 2 months ago: www.pbg-bulgaria.com The problem is that I cannot find the site for keywords such as \u201cbulgarian properties\u201d and related terms in the first 30 or even more Google pages. The tool I'm using for Google position says that the site cannot be found in the first 100 pages, which is quite alarming. Google Analytics shows no traffic from generic keywords (only keywords associated with the name of the company). I know that a lot of improvements can be made: there are no h1 and h2 titles, I don\u2019t have many backlinks yet, and I could put some more keywords here and there, but overall I think the site is OK. In the Google Webmaster account everything looks fine as well. Do you have any suggestions?"} {"_id": "10546", "title": "Indexing and Page Ranking Issues", "text": "Hi all, I am on the first page of Google for keywords concerning MOVING; however, I can't seem to break into the carpet cleaning rankings. I have made changes and additions which haven't been indexed yet. Should I wait for the next crawl, or can someone please give me pointers on the carpet cleaning indexing? Also, I have 53 pages submitted and only 38 indexed; where could the problem be? Is there software to check for indexing hiccups? Thanks."} {"_id": "25210", "title": "What major SEO steps have I forgotten?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google?
I'm a Java developer-turned-impromptu-web-developer trying to help my father get his woodworking website off the ground. A week ago I knew (next to) nothing about SEO, and have spent a great deal of time revamping his site to meet Google's \" _white hat_ \" best practices. Here is a list of steps I've taken, and this question is nothing more than a formal request for commentary to see if I've made any grave errors (of omission or commission!) that will hurt his site's rankings: * I've designed every page to have unique, brief/terse, meaningful content * I've set him up with Facebook, Twitter & LinkedIn profiles and linked them to his website (in both directions) * Each page is uniquely/briefly/meaningfully titled * I use one and only one `h1` element per page * I have a single-sentence _description_ `meta` tag * I have a list of about 20 common search phrases listed in the _keywords_ `meta` tag * I have populated every `img` tag with an `alt` attribute that contains a list of a few short, meaningful phrases (similar to the ones in the _keywords_ `meta` element) * I have set him up with an HTML sitemap * I have placed a valid `sitemap.xml` in his webroot * I have placed a valid `ror.xml` in his webroot * I have placed a valid `robots.txt` in his webroot (contents are below) `robots.txt`: User-agent: * Disallow: (I want Google to crawl everything!) Several years ago I was under the impression that proper SEO involved actually submitting something to Google (like sitemaps, etc.), but I couldn't find any reference to that while reading up on contemporary SEO best practices. Please, if anyone has any concise, effective recommendations for anything I've overlooked or done poorly, please let me know! As always, any hints, tricks or tips are enormously appreciated! Thanks in advance! **Edit:** I know it's foolish to disregard other search engines, but Google is the biggest player, and quite frankly I only have the time & energy to make him rank highly in 1 of them, so Google is my choice."} {"_id": "63362", "title": "What does SEO comprise of aside from meta tags?", "text": "I have seen people add meta tags to HTML pages containing keywords and then claim that the SEO was complete. Is this all that SEO is? And having done that, will your site automatically rank highly on a Google search page provided that there isn't too much competition for the search terms?"} {"_id": "53145", "title": "SEO: why won't this site index?", "text": "I recently built a one-page site. All main pages are loaded via ajax, but they also exist as standalone versions in HTML format; through jQuery, only each page's content is loaded into the one-page site.
I have created a sitemap and submitted all these things to Google Webmaster Tools, which tells me that everything is fine and the pages are indexed and not blocked. I also posted the site URL in some forums where people talk about the main product (Chirofol); a Facebook page is also to be released soon. Despite all these things, when I type \"chirofol\" into Google, the site is not even in the first three or four pages, and this is simply driving me mad."} {"_id": "30451", "title": "How would I rank my keywords in the Yahoo search engine?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I am working as a search engine optimizer team lead in a company and facing a problem in a project, http://www.Prooftech.com.sg. Problem: the website has 10 keywords for which my client wants top-10 rankings in the Yahoo Singapore search engine. In my 3 months of work I have gotten top-10 rankings for the following 7 keywords: Waterproofing, RC Roof, Wall Leakages, Ceiling Leakages, Water Leakages, Roof Tile Coating, Roof Tiles Repair, but I am still not getting listing positions for Roof, Concrete Repair, and Grouting. I have done a lot of bookmarking, blog commenting, blog creation, press releases, and classified ads to get these 3 keywords listed, but there is no change in the results. Can anyone help me out of this problem so I can get good rankings for Roof, Concrete Repair, and Grouting?"} {"_id": "54559", "title": "Adding tags for SEO in clothing website", "text": "I am building a site for a women's accessories brand. The site has a homepage, a store page (where all accessories are displayed), a page for each accessory's description, an about page and a contact page. There is also a whole setup for the shopping cart and checkout (irrelevant to this question). My issue is the SEO. Where can I put the keywords? The home page has only the menu and some photos. The store page displays the items and their titles.
Then each specific item's page has a description of the item (pulled from the database), category and price. However, I feel like this is not enough SEO for Google ranking. Where could I add tags in this type of site?"} {"_id": "38471", "title": "What factors help in getting a site indexed by Google fast?", "text": "> **Possible Duplicate:** > What are the best ways to increase a site\u2019s position in Google? What should I do to get a site indexed on Google fast? Taking the example of Super User: I just did a quick Google search for a question that was 20 minutes old, to look for an answer, and it was already in Google Search - how is this possible? I glanced over this article, which seems to suggest that it's because SU has added RSS feeds (which SU has, but when I opened the feed the article said last posted 6 minutes ago, yet when Googled it is 11 hours old) - which leads me to think (based on that article; I don't know much about search indexing, but I am reading about it at the moment) that most of this indexing is done thanks to the sitemap. Is there anything else I am unaware of that helps SU questions get on Google so fast?"} {"_id": "30919", "title": "What tags should be used for SEO in simple blog posts?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? > Order of HTML meta tags I'm new to websites in general (which is why I'm starting on Blogger) and am just curious as to which tags I should be using in my posts. My posts generally include a title relevant to the keyword of my blog, an image, a video, my content, and a link to my Twitter. I don't really use any tags, and someone mentioned that they may help with SEO. So I figured I would ask for some advice from people who know what they are doing. **EDIT** I apologize for not clarifying; I meant tags such as: Header tag:
`<h1>This is my title</h1>` Image tags with attributes: `<img alt=\"this_shows_examples\">` What other tags and attributes are ones I should use for a well-put-together website and SEO?"} {"_id": "65402", "title": "Designing an SEO-friendly website: should I follow any standard HTML format?", "text": "I am a web developer, developing a web site. I am considering how search engines index my pages. I observed a couple of sites which are ranking well in Google that make me doubt that I am doing it well. This is in the HTML structure of my page:
`<img alt=\"Web design practice\">`
I am going to use \"web design practice\" as the keyword for which I wish to rank in Google. Is the code above good practice? Are there any SEO tactics to be considered? I read a few articles which talk about white hat and black hat techniques, and I'm not sure which the above might be."} {"_id": "58939", "title": "Increasing the frequency of rarely searched keywords", "text": "I work in an organization and have been given some SEO work (it's not actually my job; it is a job-related task). A complete scenario of the problem is given below: * I live and work in India, where there is very little awareness about products used by disabled people, like screen magnifiers and specially designed keyboards for partially blind people. * The problem is that we are not getting enough hits from specific keywords. List of keywords searched, collected from `Google Trends` (`Location`: India, `Year`: 2013): * **disabled** Max search: 100 (Google Trends URL: https://www.google.co.in/trends/explore#q=disabled&geo=IN&date=1%2F2013%2012m&cmpt=q) * **low vision product** Max search: 0 (Google Trends URL: https://www.google.co.in/trends/explore#q=low%20vision%20product) * My boss wants hits to the website from keywords such as `disabled, low vision product, blind, deaf, dumb` etc. through **organic searches**. * The data collected from Google Trends shows very low numbers for the above searches. My boss wants some ideas to increase the hits from these keywords. * In other words, people should search the web using these keywords. Once searches for these keywords increase, our website may get more hits (as people would become aware once they start searching for these keywords). * My boss wants some ideas from me. I am completely blank about how to make users search for those terms in search engines (surely I cannot force them), but I can make them somewhat aware.
Can anybody provide me with some suggestions?"} {"_id": "32823", "title": "Control the ranking of my sites on Google", "text": "> **Possible Duplicate:** > What are the best ways to increase a site's position in Google? I have two sites that I both control and have hooked up to Google Webmaster Tools as different sites. My problem is that one of the sites (Site A) was created as preparation for the other site (Site B). Although I want Site A to still exist (even on Google), I would like Site B to score better in search engines. Is there anything I can do, implicitly or explicitly, to make Google (and preferably also other search engines) prefer Site B's content over Site A's?"} {"_id": "52140", "title": "How to enhance website product pages for SEO (Opencart, Google)", "text": "I launched a UK-wide furniture business, which I can appreciate is a very difficult industry to break into, but nonetheless I have implemented the basics by ensuring I have: 1. Good meta data (titles, descriptions, and keywords - even though the latter is ignored by Google) 2. Adequate H1 and H2 tags 3. Sitemap submission 4. Good quality content, trying to keep the keywords within a certain percentage of all the text (~4%?). This has resulted in being within the top 10 SERP on Google for about 30 different keyword combinations, as shown by GWT (not the best keywords, but still giving me an online presence). My question is: what can I do to further improve my position? Any general advice which I and others can implement to take it to the next level? Home page: http://www.imbued.co.uk/ sample product page: http://www.imbued.co.uk/sonya-three-sleeper-inc-storage-drawers-white"} {"_id": "16873", "title": "Is using keywords in a domain enough to ensure a top position in Google?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google?
If I want to be listed top for the query \"where is obama from\", what more do I have to do beyond registering a keyword-phrase domain such as whereisobamafrom.com?"} {"_id": "36109", "title": "Site is not indexed on Bing", "text": "My site has a good ranking on Google, but it is not even indexed on Yahoo and Bing. I submitted my site to both the Yahoo and Bing search engines 3 months ago. I also submitted my site's sitemap to Bing Webmaster Tools. Even so, I don't find the site to be indexed. I want the site to be indexed on both Yahoo and Bing."} {"_id": "22017", "title": "SEO tutorials and other learning materials", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? Please recommend one or two trustworthy SEO tutorials or any other reliable source of SEO wisdom. :) I watched the video tutorial from lynda.com and I liked it a lot, but I don't know how up to date it is. I am also reading SEObook by Aaron Wall. Do you think this is a good source? Thank you!"} {"_id": "59876", "title": "Need help optimizing meta tags for a specific keyword", "text": "I need to make my website appear when I google \"achar estacionamento\" (find parking lot in Portuguese). Can you point out some tips that could help me achieve this?"} {"_id": "48303", "title": "Missing from Google", "text": "Recently I noticed our site no longer appeared in Google search results. Our site has been in Google for the last 3 years without any issues. So far I've checked: * In the Webmaster Tools it says there are no crawl errors and that we have pages indexed. * In the Webmaster Tools I've checked that the robots.txt file is reachable by Google and works as expected. * In the Webmaster Tools I've tested that Google can reach our site, and then submitted the site to be indexed. * In a page rank checking tool it says our page rank is N/A. * I've submitted a check to see if Google banned us, which came back fine. * I've added a sitemap.xml file.
(Although our site content is pretty static.) Our site is: www.hiveintelligence.com"} {"_id": "30485", "title": "Competing against existing well ranked Websites", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? What can I do to compete against existing, established websites with Google PageRank 3? I will try to add more keywords to my site by populating my website with articles similar to (not copies of) those present on the other websites, but will my page get a good ranking, since they have good SEO with lots of good articles and Facebook followers? How can I get customers?"} {"_id": "57425", "title": "How can I make my website appear in the top position of Google?", "text": "I have added some of the keywords in a meta tag, but my site is still not in the top position when Googled. How can I achieve this?"} {"_id": "22086", "title": "Best S.E.O. practice for backlinking etc", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I'm currently working on a website that I am really looking to optimise in terms of search engines. I've been submitting between 5-20 directory submissions daily, I've validated and optimised my code, and I've joined a lot of forums etc. to speak of the website in question. However, I don't seem to be making much of an impact in terms of Google. I know that S.E.O. takes a while to start making an impact, and that Google prefers sites that are regularly updated and aged, but are there any more practices that can really help with organic results in search engines? I have looked on Google itself, and a few other SEs, but nobody is willing to talk about extensive S.E.O.
practices, as they normally don't want people knowing their formulas for S.E.O. Also, does anyone know of a decent piece of software that really looks into the ins and outs of your page and provides feedback? I usually use http://www.woorank.com, but using only one program doesn't show whether what it says is exactly correct. If anyone could help it would be much appreciated, thank you very much."} {"_id": "10548", "title": "Move subdomain into subdirectory SEO question", "text": "I have read this article: http://www.mattcutts.com/blog/subdomains-and-subdirectories/ But I'm not 100% clear if moving my subdomain website into a subdirectory on the main domain would change anything related to SEO. I inherited this structure: > 1. Informational site related to our specific industry lives at: > http://website.com > 2. StoreFront where we sell product related to our industry lives at: > http://store.website.com > The informational site gives a lot of good information on how to use the products we sell. The storefront is primarily used for the ecommerce function of selling the products, but there is a lot of info specific to the products on that site. **Question:** Is our main domain http://website.com getting PageRank credit for the product info contained at http://store.website.com? Would there be a benefit to changing the structure?"} {"_id": "17791", "title": "django websites and google", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I have recently finished one of my side projects. Now, many sites have things like robots.txt files (probably not the best file to describe what I need, but you get what I mean) and things of that nature. I'm not totally sure what else there is, but there are things behind the scenes that sites use to identify themselves to the likes of Google, Yahoo, and Bing (but I could have just said Google! zing!).
I was wondering what I need to add to my site for it to be fully web compliant, searchable, and proper. I am basically looking for best practices. Thanks."} {"_id": "48919", "title": "What do first for SEO activity?", "text": "I'm a newbie at SEO, but I am motivated to become an SEO master. I have just created a website. So, what should I do first to ensure good SEO in the future? How long does it take for a website to reach a high PageRank?"} {"_id": "67687", "title": "SEO of your website", "text": "Hello, I am learning new things to apply to the SEO of my website. What concepts and tools do I need to follow or use to optimize my site, and how do I check whether my website is optimized or not? Thanks in advance."} {"_id": "22837", "title": "Is there any good authoritative source of information on SEO practices that is backed up by data?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? Any Google search for anything about SEO yields more articles than you can shake a stick at, but a lot of the articles are out of date, many have conflicting advice, and just about none of them ever give any reasons/proof/data to back up their claims about what works and what doesn't. Has anyone done any at least somewhat scientific tests to see what works and what doesn't (and ideally why)? Or has anyone from Google released any non-basic information about best practices? Really what I would love to do is A/B test different SEO techniques, but the time lag and sheer number of variables make it very difficult. Has anyone ever tried this type of thing? (And published their results?)"} {"_id": "54052", "title": "If Googlebot is crawling 25 pages per day, why isn't my website indexed by Google?", "text": "I have created a WordPress site which is 2 months old and has 15 posts. The average crawl rate is 25 pages per day. Is that sufficient for my site to get some organic traffic from Google search?
If it is not sufficient, what must I do to increase the crawl rate as well as site traffic organically? My site is http://how2impress.com/"} {"_id": "52634", "title": "Generating Backlinks", "text": "I am new to the field of SEO. My site is intended to target local customers only, and I want to generate quality backlinks for it. Are there any tools to generate quality backlinks, or blog sites to improve my backlink count? I am trying to improve my page ranking. Any other SEO tips would also be highly appreciated! Thanks in advance"} {"_id": "43356", "title": "SEO not indexing my site: \"A description for this result is not available because of this site's robots.txt \u2013 learn more.\"", "text": "When googling my site I get the following text under the result for my domain: > \"A description for this result is not available because of this site's > robots.txt \u2013 learn more.\" My robots.txt, which resides in the root of my V-host, looks like this: User-agent: * Disallow: I did have a problem with this file before, since I forgot to change it from the test environment, when it contained: User-agent: * Disallow: / However, I did change it yesterday or the day before. I have been googling, and this seems to be the correct way of handling it as it is now. Checking the access log of httpd for today, I can see that Google has tried to access it. The access_log also shows that Googlebot has been accessing most of my links as well, which feels like OK indexing... Nothing in the error_log from today about robots.txt. What I cannot find is whether Google has been indexing /index.php. But I guess the following entry is the same, isn't it?: > 66.249.75.179 - - [07/Feb/2013:15:23:01 +0100] \"POST / HTTP/1.1\" 200 1 > \"http://www.mydomain.ext/\" \"Mozilla/5.0 (compatible; Googlebot/2.1; > +http://www.google.com/bot.html) What is my problem and how do I resolve it?
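As a sanity check, the two versions of the file can be compared with Python's standard `urllib.robotparser` (just a sketch; I'm assuming its interpretation matches Googlebot's for rules this simple, and the hostname is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The "open" policy currently in robots.txt (empty Disallow allows everything)
open_rules = "User-agent: *\nDisallow:\n"
# The old "closed" policy left over from the test environment
closed_rules = "User-agent: *\nDisallow: /\n"

def allowed(rules: str, url: str) -> bool:
    """Return True if the given robots.txt rules allow Googlebot to fetch url."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch("Googlebot", url)

print(allowed(open_rules, "http://www.mydomain.ext/index.php"))    # True
print(allowed(closed_rules, "http://www.mydomain.ext/index.php"))  # False
```

The only difference between the two policies is the trailing slash, but it flips the result for every URL on the site.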
**UPDATE** When testing in Google's Webmaster Tools I get (this is for my .com domain; all of the above applies to it as well): > Blocked by line 2: Disallow: / Detected as a directory; specific files may > have different restrictions And when doing the exact same thing with my .se domain, which points to the exact same content, I get: Allowed by line 2: Disallow: Detected as a directory; specific files may have different restrictions"} {"_id": "65336", "title": "Links removed from google search results", "text": "For my website www.trickwarehouse.com I have created one post on keyloggers. When I searched Google for `what are keyloggers?`, my website's post was at position 16, but when I checked again an hour later, **BOOM!** nothing was found for my website within the top 100 results. Other keywords that were in the top 100 results were also pushed back by 2-3 positions. Why is this happening? Any answers?"} {"_id": "56520", "title": "OpenCart search engine optimization!", "text": "What kind of procedures would be important to optimize an OpenCart webshop for car parts? Thanks in advance"} {"_id": "53822", "title": "Google crawling pages but not adding them to index", "text": "I have a website for which Google has \u2014 according to Webmaster Tools \u2014 been crawling 1000-5000 pages per day. However, the number of pages _indexed_ (as judged by both the Webmaster Tools count and the result count for a `site:` search on Google) has remained fairly static for weeks now. What's going on, and why is this happening?"} {"_id": "10533", "title": "What are the best SEO techniques for a professional blog?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? Beginner to SEO here, starting with a personal site, looking for some insight and feedback. Question: what's more important, domain name or site title? Question: how important are the meta tags (description and keywords) on your site? The description should be under 60 chars, right?
How many keywords is ideal? Question: what is the #1 most important SEO principle? (My guess is getting others to link to your site.) Thanks."} {"_id": "25656", "title": "How to be the first site in google search?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I searched for the title of my website in Google search, and although it is a very specific title, I didn't find my site. My question is: how can I be among the first sites in Google search? I heard that Google doesn't look at ``s anymore and uses another method. Is this true?"} {"_id": "21976", "title": "Question about SEO and Domains", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? > Move subdomain into subdirectory SEO question This is my first post on here, as I am mainly on Stackoverflow and Serverfault. I have been programming for at least 10 years now and have made hundreds of websites, but I have only recently started getting into design and the SEO side of sites; it's sad that I have been overlooking these for so many years. I have picked up pretty good SEO knowledge over the years, but I have never really looked into it until now. My question: I would like to build a site that targets many different keywords for the search engines. For example, let's say I built a site about outdoor activities called `outdoorreview.com` and I planned on having many sections: `hunting` `fishing` `Hiking` `camping` `cycling` `climbing` `etc...` For the best search engine results, how could I get the most search engine traffic to all these areas? Also, how should I structure the way to get to them: `outdoorreview.com/Hiking/` or `hiking.outdoorreview.com`?"} {"_id": "37599", "title": "How can I increase traffic to my website?", "text": "> **Possible Duplicate:** > How can I increase the traffic to my site?
I developed a site with the WordPress CMS, and I am currently in the referencing (SEO) phase. I would like to know how I can increase traffic to the site."} {"_id": "52834", "title": "Facebook and Website SEO", "text": "I am the webmaster of a website, and one issue that I'm having is getting our home website to appear above our Facebook page. Are there any Google Webmaster Tools or meta data tricks I can employ to make our main website rank higher than our Facebook page?"} {"_id": "19356", "title": "How do I get the word out about my site?", "text": "> **Possible Duplicate:** > How can I increase the traffic to my site? This has plagued me for a long time now. I just can't seem to grasp internet marketing very well. I recently started a website to help people with anxiety; now I just have no idea how to make people aware of it. The website is free and doesn't even have ads. I don't have the money to put into AdWords right now... I've asked people to link to my site, but nobody wants to because they think it will take away from their own visits. How can I bring more hits to my site?"} {"_id": "43350", "title": "Why aren't search engines indexing my content?", "text": "> _This is a general, community wiki catch-all question and answer pair > intended to address any questions concerning the reasons a site or specific > site contents do not appear in search engine results._ > > _If your question was closed as a duplicate of this question and you feel > that the information provided here does not provide a sufficient answer, > please open a discussion on Pro Webmasters Meta._ My site (or specific pages on my site) is not appearing in search engine results. Why isn't my content indexed and what can I do about it?"} {"_id": "68513", "title": "Must-do regarding on site SEO", "text": "I have read a lot of articles regarding on-site SEO. I focused mostly on the Yoast SEO recommendations (since I'm using WordPress) and the Google guidelines.
* Structured headings * Optimized site speed * Breadcrumbs * Updated sitemap * No-follow all external links * No-index pages with content that is not unique What else should I do to make my site more SEO optimized? Any recommended reading is welcome :) After making those changes, how soon should I notice some positive changes in page rankings?"} {"_id": "11623", "title": "Order of HTML meta tags", "text": "An SEO company suggested we change the order of our HTML head tags so that `<title>` and `<meta name=\"description\">` are the first two. They say this is to ensure search engines can utilize these two tags. I've been under the impression that the order of tags inside the document's head is not significant. Have I been wrong? Are there really search engines that assume that the first two tags are always `title` and `description` and give up looking for them if they aren't?"} {"_id": "18730", "title": "SEO optimizing Blogger blogspots", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? What are some ways to optimize a Blogger blogspot for SEO? I have looked at several other articles, but they were old, so the tactics used won't work with the newer Blogger."} {"_id": "47231", "title": "site appears in Google Webmaster Tools but no site: search results", "text": "My site is geocology.ca. About a year ago I attempted to add geocology.ca to Google's search index by following the steps for 'Moving Your Site' in Google's Webmaster Tools, transferring from my old site at hughstimson.com. Everything seemed to go fine. * I submitted a sitemap for the new site, and it was read successfully. * I added a robots.txt file with a single entry ALLOWing all user agents, and it quickly showed up in Webmaster Tools, properly registered. * When I go to Google Webmaster Tools there are no errors and no complaints. **BUT**: my site never appeared in Google's search results. At all. Searching for site:geocology.ca produced no results.
After a couple of months I added a small WordPress blog on a subdomain, as a sandbox for a client. That blog quickly showed up in Google's index, and now if you search for site:geocology.ca the content of that (sandbox) blog is all that will appear. Does anyone know how I can address this? Is there perhaps a way to completely reset Google's knowledge about my site and start again?"} {"_id": "47766", "title": "Why do all pages bar the home page turn up in Google?", "text": "Our website is made with WordPress. The menu pages all turn up in Google searches, but the index page doesn't. The index page is absent; the others are there. Why would this be?"} {"_id": "23501", "title": "Landing page not appearing on google", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I have a website which describes every product made by the company, and it appears on Google, which is fine. Now I have also made a nice landing page, on its own domain, for one product. The page shows a demo video and has a link to the full website that the visitor is supposed to click if he wants to learn more about the product. The problem is that the page does not have text content, just the video and the link. So Google doesn't know what it is about. I added meta keywords and alt attributes, but this didn't help; the page is not listed in the results. What should I do to make it appear on Google?"} {"_id": "3472", "title": "Site not indexed by Google", "text": "I typed the following on Google: (1) site: uniquevolve.com (2) site: uniquevolve.com.au Only (2) gets indexed. Is this normal? The strange thing is, my website is actually hosted on the server with domain (1), and domain (2) only redirects to (1). How do I get (1) indexed by Google?
Thank you in advance."} {"_id": "33343", "title": "Did what I can to keep my website SEO friendly, yet newer websites are ranked better?", "text": "> **Possible Duplicate:** > What are the best ways to increase a site's position in Google? I have a website that gives information about weddings. My site has been up for almost two years now; I designed and programmed it entirely myself. While developing my website I tried to keep it SEO-friendly; I read about SEO on various websites, read articles about it, and such. Nowadays my website still ranks quite badly for the \"big\" words such as \"Wedding\" (not in English). My budget for advertising is not the biggest. I am thinking of advertising with AdWords, but I don't think that it really affects SEO, does it? The thing is that about two months ago, I found a new website that does pretty much what my site does. It was behind me in Google's results, but suddenly started climbing up, and now it's on the first page. How does one do such a thing? I looked into the domain history, and it is newer than mine as well. I never thought that it was possible to go up so fast. I have seen some companies that go around and say that you will be on the first/second page for sure if you pay them a nice amount of money, but I thought that I could skip this because, as far as I've heard, SEO is not really controlled by these people; what they could do is just change a few things in my design to make it even more SEO-friendly, and maybe link to my site from random forums and websites. Am I wrong, though? Can they really guarantee such a thing? Is there any \"SEO magic\" that I don't know of? I'd love to hear what you guys have to say about this and learn more about how to improve my website's ranking. Thanks"} {"_id": "68954", "title": "Problem indexing my site", "text": "I developed a site, but I'm having problems getting it indexed. It has 10 pages, but when I search for my site on Google the first result is the \"contact\" page and not \"index\".
I have included a title and the meta description and keywords tags on the index page, but the result is always the same."} {"_id": "39529", "title": "Blog is not indexing in google search", "text": "I have a blog at http://webzpedia.blogspot.in which is about 2 years old. In the early days there were 200+ pageviews coming from Google search alone, but in the last 4-5 months Google has stopped showing my blog in its search results. Can anyone tell me why this is happening?"} {"_id": "30915", "title": "How to promote travel blog?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? > How can I increase the traffic to my site? I know this question might seem a little off topic, but blogging may become an important part of travel. Nowadays, in the time of Facebook, Twitter and similar services, keeping a travel blog may seem a little _archaic_. It's not 2005 anymore. But a lot of my travel colleagues update their blogs and have a significant number of readers. I have also tried to keep a blog when I travel. However, it seems that the only reader is my mum ;) **What is your advice on promoting a _travel_ blog?**"} {"_id": "10265", "title": "How to start learning SEO", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I'm a web developer (.NET) and a newbie to the SEO world! Where and how do I start? What are the steps to go from a newbie to a more experienced SEO? Thanks in advance =)"} {"_id": "34151", "title": "My website not getting any traffic, How to get traffic?", "text": "> **Possible Duplicate:** > How can I increase the traffic to my site?
I have done pretty good SEO on my website. The website is made in PHP; anyone can submit an article, and submitted articles are moderated by the moderators. The site has been online for more than a month, but the user count is still only 10-20 and total impressions are 700 according to Google Webmaster Tools. How much time does Webmaster Tools take to refresh the data? For the last 3-4 days the impressions shown in the dashboard have stayed at 700. I am posting 2 articles each day. Please help me; I am very disappointed given all my effort, and I really need some good motivation to carry on my work. My website URL is http://www.viewloud.com"} {"_id": "55223", "title": "Is anyone facing a ranking problem with their website while indexing is quite fine?", "text": "My websites gosarkarinaukri and sarkarinaukricareer are both Indian educational portals. If I update pages on these websites, they get indexed by Google, but they are not ranked properly as they used to be. The problem is, when I copy an entire title and paste it into Google, Google doesn't show any result from my website. Is anyone facing the same problem? Please suggest anything good."} {"_id": "35284", "title": "Only 10% of my site are indexed", "text": "I'm a newbie to Google Webmaster Tools and I use sitemap.xml to submit URLs. The problem is that I've submitted 150 URLs and only 17 are indexed! Is that a problem, or will they be indexed later?"} {"_id": "15568", "title": "Hoping to boost traffic to NFP site. SEO info? HELP!", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I'm doing work for a not-for-profit looking to boost traffic to its site. I need to learn a tremendous amount about SEO and backlinking in a short length of time. For any tips you can give me, or sites you can refer me to, I'll be most grateful. A million thanks, Chaz"} {"_id": "217", "title": "How can I increase the traffic to my site?", "text": "I don't want to do link exchanges or anything shady.
I just want some legitimate ways that I can direct a little bit more traffic to my site. How do I do that?"} {"_id": "47193", "title": "Google Webmaster Tools Index Status is 0", "text": "Statistics show all entries of my WordPress blog have been tracked, but none have been indexed. I thought it would just be a matter of indexing delay, but it has been 3 weeks already, and I can find the entries on Google when I query the entry name (e.g. \"relaxacion.com musica para dormir bebes\" shows at the bottom of the first page of search results). Is there any problem with my WordPress setup?"} {"_id": "16288", "title": "How can I bring my website up in the google search result?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I developed my website a few months ago. I added meta tags and described everything as much as I could, but my site is still not at the top when I search for kashif ejaz on Google. Please help!"} {"_id": "50023", "title": "After replacing all tables in an old website with divs, what other steps should I take?", "text": "I designed a website a few years back, and it ranks pretty well; the customer is happy, no problems there. I took one of the pages and manually replaced all the tables with `<div>`s, used structured data, and got the website to look exactly the same. I would like to know what other steps I should take to improve, or at least not hurt, this page's rank, or if perhaps I should just not bother altogether. What are best practices here? By the way, the page is not live yet."} {"_id": "35517", "title": "Google crawls my website too slowly", "text": "> **Possible Duplicate:** > What are the best ways to increase a site's position in Google? I think Google crawls it too slowly (every three days). I update it daily and it's kind of frustrating. Should I build more backlinks?
The robots.txt looks OK."} {"_id": "54482", "title": "Why isn't my website ranking highly and what can I do to improve it?", "text": "What can I improve to make my website rank better on search engines for my target keyword, \"address labels\"? I've tried a lot of things to make it better, but I'd love to hear some more suggestions! The address is www.thelabelroom.co.uk"} {"_id": "55014", "title": "Increase traffic to site", "text": "I have made a social networking site. It's been on the web for over 6 months and it has over 8,000 members, but I want it to grow bigger. What tools/methods can I use to grow its popularity, e.g. SEO, PPC advertising? Which tools/methods require money and which don't, and how do they compare? Thanks in advance, Jack."} {"_id": "21845", "title": "How to be on google first page results?", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I searched the web and found a lot of \"techniques\" for being in the first page of results on Google or Yahoo. Is it possible to pay for that?"} {"_id": "25417", "title": "Site not indexed in Google, but indexed in Bing and Yahoo", "text": "When I search for mytgp.ph on Google, I don't get any relevant results. But when I search using Yahoo or Bing, the site is the first result on the first page. Any ideas? Here's my `site:mytgp.ph` query on Google: ![enter image description here](http://i.stack.imgur.com/GuwCW.png)"} {"_id": "57786", "title": "Our indexed website appearing and disappearing from Google SERP", "text": "Our website www.skadate.com used to appear in Google search results when searching for the keyword \"dating software\"; now it doesn't. It doesn't come up for any other keyword it used to rank for either. Sometimes our index page appears in the SERP, sometimes it doesn't; usually a PHP page appears.
Why is this happening? Any ideas?"} {"_id": "61673", "title": "cannot find my site in Google/Bing - live 4 months", "text": "I have a site, led-lights.ie, which went live in January, and it is still getting just a handful of visitors a day. I cannot find it by searching Google using various terms, even after going through over 20 pages of results. However, site:led-lights.ie shows over 100 pages in Google, and its images are indexed too. Google WMT shows no errors and that the XML sitemap has been processed. An Alexa site audit gave a score of A/91% (SEO / Performance / Security / HTML scored A, Reputation C), the only issues being duplicate or long title tags and duplicate meta descriptions. As far as I can tell, the domain is \u201cnew\u201d, i.e. no previous usage or penalties. Can anyone see if I\u2019m missing something obvious with this site, SEO-wise? Thank you."} {"_id": "47923", "title": "How to get the content right for better showing in google results", "text": "I am looking for basic help or guidelines for making my site earn better visibility in Google and other search engine results. My site is a price-comparison engine for products, so I want to know what should feature in my meta keywords, meta description, images, and content. For example, you can think of the Nokia Lumia 920 as one of the products."} {"_id": "25585", "title": "SEO/performance - building blog article page", "text": "> **Possible Duplicate:** > What are the best ways to increase your site's position in Google? I've been using WP up until now, but I want to spread my wings and build a bespoke blog for my website. The reason I want my own blog is that I cannot get my WP site to match the branding/design of my main website, and because my blog is very basic, there is a ton of stuff in WP that I really don't need. I'm using ASP/Windows IIS7 and the URL Rewrite module, which is all working fine (tested), plus I've built my database and administration pages.
I'm now about to tackle the article page itself. Before I get started with it, I would like to hear some suggestions/ideas/tips on building the page correctly, mainly for enhancing SEO rankings, but for general performance too. I know that using heading tags (h1-h6), paragraph tags (p), and alt/title attributes will enhance rankings if used correctly. I'm also familiar with meta keywords and description tags, especially dynamically produced ones. What other building requirements are there that will help with successful SE ranking and general performance? Is there anything else I should be aware of when building this article page? And also, I'm not bothered that my users cannot remember the URL of the page, but will using a URL like www.mydomain.com/articles/12334435544 be a problem for SEs, using only numbers (representing the article ID) and no text?"} {"_id": "8507", "title": "What is the best way to get robots to crawl your site from google?", "text": "> **Possible Duplicate:** > Why isn't Google crawling my blog? What is the best way to get robots to crawl your site from Google? With reference to our site [site elided]"} {"_id": "22870", "title": "New webserver offering HTML for download, not for browsing", "text": "I recently purchased a hosting plan from a provider. They gave me a temporary URL for accessing the hosting space, the control panel, and FTP details. I deleted all the files from public_html and put up an index.html, but now every time I click on the temporary URL, the file gets downloaded (with a download name) instead of being displayed. What could be the issue?"} {"_id": "19391", "title": "Sitemap and url are laid out differently from each other", "text": "There is a website and the layout looks something like below: home -> approved -> (education | entertainment | business) -> each of these splits into (kid, teenager, adult) -> and so on..
home -> not approved -> (education | entertainment | business) -> each of these splits into (kid, teenager, adult) -> and so on.. **On the homepage, one would see two links - approved and not-approved.** Once you click on either link, the relevant entries come up, and both pages also have filters on top for further drilling down; see above. URLs are organized like below: Home/Approved will pull up all entries. Home/Approved/Education will pull up education only (and so on). Home/Approved/Education/Kid will pull up educational kid entries only (and so on). **I am planning to organize my sitemap to have two static parent nodes - approved and not approved - and then add all entries dynamically to the sitemap under these two parent nodes, and not include sublevels in the sitemap at all.** Do you see any problem/pitfall with this approach - in terms of SEO or anything else?"} {"_id": "27808", "title": "What should a page's minimum word count be in order to be effectively indexed?", "text": "> **Possible Duplicate:** > Ideal word count per web page? I'm seeding a new site with hundreds of (high quality) posts, but since I am paying per word written, I'm wondering if anybody in the community has any anecdotal evidence as to how many words of content a page should have to be counted just the same as a 700-word+ post, for example?"} {"_id": "24082", "title": "SEO - Is it right that only 20% of visitors come on the homepage?", "text": "A few days ago I was discussing with a xoogler the percentage of people arriving at a website, and how many of them really come in through the homepage. He said that on a well-studied website only 20% of traffic comes through the homepage.
I didn't know much about this percentage, so I ran to my computer to find out more, but I couldn't find anything about it. Of course, getting a xoogler to talk is pretty hard, so I'd prefer to research it myself; do you have any links or knowledge about this? I understand that the homepage does not have the \"real\" content a user may be searching for, but I'd like to read more about who measured these percentages and where."} {"_id": "26572", "title": "command-line zip not working", "text": "I have a WordPress site on a Debian/Linux dedicated server, with a BackupBuddy plugin for making automatic backups. The plugin, however, gives the error 'Your server does not support command line ZIP'. My knowledge of Linux commands is very limited, but I managed to install zip with the command `sudo apt-get install zip` However, I still get the same error message. The plugin documentation mentions the problem could also be caused by a disabled exec() or by safe_mode - but exec isn't disabled, and safe_mode is off. Any ideas what might be causing this, or how to fix it? The only thing I could think of is that it might be caused by wrong permissions."} {"_id": "58071", "title": "Can inbound links through template-based layouts result in penalties?", "text": "So obviously link building is encouraged as long as it is natural and organic, with meaningful links and content relevant to your site. Obviously, with the constant release of new algorithm updates, Google is flagging sites for unnatural links to them. My question is: * Can this be caused by templating systems? With WordPress, for example, where you can add a link in the footer and it is repeated throughout the entire website, generating thousands of links? * Even if we don't add any links, good content will be re-posted and linked to; surely if your content is constantly linked to, this will flag your site for \"unnatural\" links, as it's difficult to tell whether someone has been paid to write an article about your content.
* Or does Google just simply want us to audit some of the links to show we are making an effort? As you can tell, we have had a manual action for: `Unnatural links to your site\u2014impacts links`. However, this seems to impact our website as well. Edit: To clarify the question: Can you get penalised for paying for advertising on a site that uses a templated sidebar? So when they create a new blog/page etc., your link is also added onto the page, resulting in thousands of links to one page on our site. I know that one effect may be that a 0 PageRank web page linking to your page dilutes the PR of our page. However, the links are only inbound, not reciprocal."} {"_id": "37792", "title": "Remove SSL from Google Search results", "text": "I am working on a WordPress site that serves up HTTP pages statically. The problem is that some of the pages are shown as HTTPS in Google Search results. For instance, if the search term \" _Example Press Kit_ \" is entered, the search result site link comes up as: https://example.com/presskit/ We don't have a site SSL certificate, so surfers are being bounced. I have tried everything. Most recently I created a new website in Google WebAdmin for the HTTPS version of our home page. Then, I added site links that should have redirected site links intended for `https://example.com/*` to `http://example.com/*`, but it doesn't work! Google still shows a dead link to `http://example.com/presskit`. I didn't think dead links lasted very long on Google results, but there they are, two weeks later. Any ideas? A `301 redirect` won't work because the site doesn't even allow redirection from HTTPS. 
It is served by CloudFlare."} {"_id": "58073", "title": "Does shortened URL carry less weight in terms of SEO?", "text": "Which backlink would carry more weight in terms of SEO: the original URL of a webpage (like `http://www.example.com/Zqe9`) or a URL shortened by Google (like `goo.gl/Zqe9`)?"} {"_id": "46866", "title": "How do Expires headers and cache manifest rules work together?", "text": "I find the W3C's official Offline Web Applications specification to be rather vague about how the cache manifest interacts with headers such as ETag, Expires, or Pragma on cached assets. I know that the manifest should be checked with each request so that the browser knows when to check the other assets for updates. But because the specification doesn't define how the cache manifest interacts with normal cache instructions, I can't predict precisely how the browser will react. Will assets with a future expiration date be refreshed (no matter the cache headers) when the cache manifest is updated? Or, will those assets obey the normal caching rules? Which caching mechanism, HTTP cache versus cache manifest, will take precedence, and when?"} {"_id": "54543", "title": "Is Google showing ads from other companies or WordPress plugin is injecting ads which are not mine?", "text": "I am using WordPress for managing my blog website. I am using an Ad Injection plugin to place my ads where I want in a post. The plugin seems very good, but I saw some ads which were not from the ad networks I was using. Now I have injected a Google ads leaderboard in one place, but I saw an advertisement from this advertiser: http://www.adroll.com/about/privacy?utm_source=evidon&utm_medium=AdChoices&utm_campaign=privacy%2Bpolicy So I want to know whether the ad-injecting plugin is doing some trick or whether Google is using ads from other networks too."} {"_id": "38502", "title": "Redirection based on user's location", "text": "I'm about to redirect my French users from mysite.com to mysite.fr. 
Is it possible using **DNS CNAME records** or should I write some **GEOLocation** code to do that? I will appreciate any other ideas. Regards, Shakib"} {"_id": "54169", "title": "Does iframe affect SEO of its parent page?", "text": "I would like to know: does an _iframe_ affect the SEO of its parent page (the page that contains the _iframe_)? I've done some searching, such as Do we still need to avoid using frame/iframe for good SEO? and Using iFrame: SEO and Accessibility Points, which tell me that: * **The content in an iframe** is not considered part of the parent page. * **The page within an iframe** may be spidered and indexed (or it may not be) but no PR is definitely passed. But these points concern the content **in the iframe**; what about the parent page? Will the PageRank of the parent page decrease because of the _iframe_? Or maybe Googlebot wouldn't crawl the parent page? Or is the parent page not affected at all?"} {"_id": "38506", "title": "Does browser understand PHP?", "text": "I know that PHP is a server-side scripting language, but I got confused when asked in an interview whether the browser understands PHP. I have Apache installed on my system. If in a PHP file I write `<?php echo 'HELLO' ?>` and I open it in a browser it shows HELLO. Now how does that happen if the browser does not understand PHP?"} {"_id": "3513", "title": "Using CSS sprites and hidden text - effect on SEO?", "text": "I am using CSS sprites for buttons on my website. They are used for stop/play/download etc. If I'm using, for example, an `<i class=\"preview\">preview \"track name\"</i>` play button with the text, then I hide the text using `text-indent:-9999px`. Would this be enough to get penalised? (keep in mind there can be 60 products per page and each will have 3 sprites with hidden text!)"} {"_id": "68170", "title": "practical reasons for using separate domains instead of subdomains?", "text": "Quite often, I see web sites that draw resources from multiple domains. 
Made up example: Foo Bar Ltd has a main web site at foobar.com, which references files at fbltdcdn.com and foobar-rewards.net. They send emails directing people to foobarlatestmarketingslogan.com, which immediately makes me think \"phishing scam\". Why is this a common practice, rather than using cdn.foobar.com, rewards.foobar.com, slogan.foobar.com, etc?"} {"_id": "56573", "title": "Relative link to a subfolder", "text": "Let's assume that I have a website with three files: `/home/blog.html` `/home/picture1.jpg` `/home/files/picture2.jpg` Now let's edit that _blog.html_ file. When I want to make a link to the current folder, I'll do it this way: `<a href=\"./\">Current folder</a>` But what's the right way to make a link to a file in the current folder, to a subfolder or to a file in a subfolder? It can be either: `<a href=\"picture1.jpg\">Picture 1</a>` `<a href=\"files/\">Files</a>` `<a href=\"files/picture2.jpg\">Picture 2</a>` or `<a href=\"./picture1.jpg\">Picture 1</a>` `<a href=\"./files/\">Files</a>` `<a href=\"./files/picture2.jpg\">Picture 2</a>` As I checked, both ways work fine in my browser. But I would like to know which of them is the preferred way. Is one of them slightly better in some detail?"} {"_id": "2689", "title": "How to Distribute File and get Download Statistics?", "text": "One of my clients wants to distribute their monthly print magazine as a free PDF download as well. Similar to HackerMonthly. We are currently not using any CMS, so we are open to any solution. It could be done with Google Analytics if I specify a separate goal for every file that will be available, but I hope I do not have to go there. Important statistics would be: * # of Downloads per File * Geographic region of the downloader moderator: add tag \"file-distribution\" if you consider it appropriate"} {"_id": "48628", "title": "How much Google takes to deindex pages after disallowed from robots?", "text": "By accident I have let Google index lots of junk pages. 
Now I have added them to disallow in _robots.txt_. Will Google completely remove the pages after seeing they are disallowed in _robots.txt_? Please note that I have far too many pages to manually remove them in Google's Webmaster Tools."} {"_id": "2685", "title": "Set the base url of any menu item in Joomla", "text": "I'm not sure if this belongs here, Super User or Stack Overflow. I'm using virtual domains, which means that www.company.com has one template and www.community.com has another template; user accounts are shared, and the site is also in 2 languages. Now there is a common global menu navigation, and some links should be on the company.com URL and some on the community.com URL. However, menu links stay on the current domain I'm on. For example: 1. I'm on company.com/de/about 2. I click on the forum link 3. I want to land on community.com/de/forum, however I land on company.com/de/forum. But I don't want to use \"external links\", because if I am on community.com/en/forum and click on the about link I need the URL to be company.com/en/about, so the URLs still need to be dynamic for language support. In other words, I want to be able to set the base URL of any menu item no matter what component. Where should I start? Thanks!"} {"_id": "1198", "title": "Subdomain versus subdirectory", "text": "I recently moved from having a main site like `www.example.com` with a subdomain associated site `yyy.example.com`, to moving everything off the subdomain to a subdirectory like `www.example.com/yyy/`, with the same hierarchy underneath the subdirectory, and now `yyy.example.com` is just a redirect page to the subdirectory, so `yyy.example.com/abc/page-x` is redirected to `www.example.com/yyy/abc/page-x` and so on. The effect has been that traffic increased to several times (total page views are now more than ten times, at the present rate) what it was on the `yyy` subdomain. Previously almost all of the traffic to the `yyy` subdomain was from the `www` site. 
This is from monitoring the old `yyy` site for a year or more. I'm curious to know if there's any research or results which would indicate whether this is universal (subdirectories beat subdomains) or \"just me\"."} {"_id": "68593", "title": "Do I get penalized for hosting a blog in a subfolder?", "text": "First of all, I know this is an ongoing debate, I\u2019ve read forums and seen different opinions but I\u2019m still confused. Here\u2019s my problem. I\u2019ve started my website from scratch (design and code) and now after a couple of months I realized that it\u2019s not very SEO friendly and my search traffic is close to none. As a result I decided to add a Wordpress blog to it as it\u2019s SEO friendly out of the box and has some good plugins like Yoast. My problem here is that I don\u2019t want my current website to affect the blog section (get lower search results on the blog because my website is poorly optimized). On the other hand, I already have a couple of backlinks to my current website because I keep releasing design freebies and people share those (would those links benefit my blog?). As I understood, if you make a blog on a subdomain, then google will treat it as a new website in which case the backlinks won\u2019t help me. If I choose to go with a subfolder then I may benefit from the backlinks but wouldn\u2019t I get penalized because of the existing website which is poorly optimized and maybe has a couple SEO issues that Google doesn\u2019t like?"} {"_id": "50269", "title": "Optimize SEO: 2 websites or 1 main website and subdomain?", "text": "I'm working on a WordPress website of a little company, let say: `www.xxx.com`. Now they want a different website for their workshops. What is the most optimal construction thinking of SEO? 
1) `www.xxx.com` + `www.xxx-workshops.com` 2) `www.xxx.com` + `www.xxx.com/workshops` 3) `www.xxx.com` + `workshops.xxx.com`"} {"_id": "65539", "title": "SEO and multiple simple websites", "text": "From an **SEO point of view**, which one is \"better\" for a not-so-little company: **having one huge site** containing everything regarding the company **or having multiple ones?** I think one site is better, because it is easier to optimize and advertise, but I just started working at a company where we have multiple simple websites (almost single page) to handle offer requests. My boss doesn't want this site structure changed. (These sites have no connection at all, but the first thing I would do is link them, and by linking I mean adding a page where there are links to the other pages.) It is easier to explain through an example. We have domains structured like this (not the real ones): * forkliftparts.com : for forklift parts * forkliftseat.com : for ordering seats * forklifttire.com : handling orders regarding tires * forkliftbattery.com : and for batteries * ... The sites are basically the same, with a little color difference. Should I start advertising the pages separately? What could I do to improve their standing in search engines? The sites are built with WordPress and with the use of SEO plugins, so I am more interested in **off-site SEO**."} {"_id": "16468", "title": "Wordpress installation - Subdomain or as folder?", "text": "> **Possible Duplicate:** > Subdomain versus subdirectory I have a client portfolio website, and the client is opening a WordPress blog with articles related to his profession. What's best for SEO - installing it as a subdomain or as a folder? 
www.myclientsite.com/blog/ or blog.myclientsite.com This is a fresh domain without any previous content."} {"_id": "41003", "title": "SEO and location targeting , sub-domain or sub-directory?", "text": "> **Possible Duplicate:** > Subdomain versus subdirectory With regard to, say, a national service that has locations based in 10 cities. Each city needs to have individual content, such as blogs, info pages, etc. Is it better to have (or does it matter): `chicago.example.com` or `example.com/chicago`? Furthermore, if I go with a subdirectory and each city has a blog, how do I set it up so the blog only affects its parent? For example, `example.com/chicago/blog` should not affect `example.com/washington`"} {"_id": "23733", "title": "example.com/blog vs blog.example.com", "text": "> **Possible Duplicate:** > Subdomain versus subdirectory I'm about to start my own blog (adding it to a domain already owned by me) and I'm wondering what is the best way to set it up. There are two common alternatives for blogs: `domain.com/blog` and `blog.domain.com`. My question is: what are the advantages and disadvantages of each alternative, and which one do you think is the best? (in terms of SEO, etc)"} {"_id": "56492", "title": "SEO: Is it ever good to use subdomains?", "text": "I just built a site that I split up as follows: www.MAINDOMAIN.com --> a landing page that redirects you to one of the following pages blog.MAINDOMAIN.com --> a blog restaurants.MAINDOMAIN.com --> a page that lists restaurants shops.MAINDOMAIN.com --> a page that lists shops Problem is, won't Google see this as 4 separate domains? i.e. Would I not have been better off with something like this: www.MAINDOMAIN.com www.MAINDOMAIN.com/blog www.MAINDOMAIN.com/restaurants www.MAINDOMAIN.com/shop"} {"_id": "61729", "title": "Benefits of placing keywords in domain name", "text": "I'm now considering the site structure and I need to choose between the productname.mysite.com and mysite.com/productname URL formats. 
The question is the following: are there any ranking reasons to use the first type of URL, or is the difference a matter of user behavior? If the difference is minor, I'd choose the second URL type, as it is simpler to handle (Google Analytics and Google Webmaster Tools are not really friendly to multi-domain sites)."} {"_id": "13586", "title": "Better SEO from sub-directory or sub-domain?", "text": "> **Possible Duplicate:** > Subdomain versus subdirectory I'm working on a site that will act as a sort of companion site or sub-site to my main site. Is there a difference in SEO between setting it up on a sub-domain or a sub-directory?"} {"_id": "56182", "title": "For SEO purposes, should I put my blog on a webpage or a subdomain?", "text": "For example, if I want to post under a webpage called \u201cblog\u201d, should I make that `example.com/blog/category/article` or `blog.example.com/category/article`, assuming that example.com has an about page, a music page, a merchandise page? What is best for SEO, or does it even make a difference?"} {"_id": "22756", "title": "Keyword in sub-domain good or keyword in sub folder? Which is good to manage?", "text": "> **Possible Duplicate:** > Subdomain versus subdirectory Is a keyword in a sub-domain good, or a keyword in a sub-folder: keyword.abc.com or abc.com/keyword/? Which of these two is better to manage?"} {"_id": "2687", "title": "Priority of tags vs posts in a sitemap.xml file", "text": "I have a content-based website and I'm in the process of creating a sitemap.xml file. I was wondering if it would be best to give high priority to tag URLs or to post URLs. Has anyone experimented with it? What will give you better rankings in Google and eventually bring in more visitors? Edit: @John Conde I know that. However, I assume that when someone searches for a term for which I have two pages of equal relevance, priority may make a difference (at least in some search engines). What I want to find out is how much weight search engines give that suggestion. 
To give you an example: I'm publishing a small CSS framework. Once every few months I give away a new release, 0.1, 0.2, etc. All these posts have the keywords \"mainframe css\" in the title, URL and post body. The same keywords are also found in the relevant tag page (site.com/tag/mainframe-css). I also have a \"static\" page that serves as a welcome page for the project, with links to all available downloads and such. So the way I see it, there are maybe 10 pages of equal relevance. However, I want visitors to be taken to the project page. One solution I can think of is linking from every post page to the project page. Anyway, thanks for the answers, people. You got me covered :)"} {"_id": "54542", "title": "What do you call the effect when you gray out the background behind a dialog?", "text": "What do you call that effect? Looks like this ![enter image description here](http://i.stack.imgur.com/YqDmF.png)"} {"_id": "2683", "title": "Hosting speed tests?", "text": "Are there any sites to test the speed of your hosting? I know there are sites that will tell you why your site is slow, e.g. YSlow, etc., but how about the speed of bandwidth you get from your hosting provider?"} {"_id": "24651", "title": "Best Practise for Micro-Sites that Google will not consider to be doorways", "text": "On behalf of my client, I have purchased a few really good URLs, to assist with their SEO. I want to understand whether my proposed use of these domains, as micro-sites to _encourage_ visitors to visit the main site, will result in blacklisting. Here's my invented example: 'Farmaceuticals' Ltd. They sell big, purple tractors. I would like to know whether this is acceptable practice these days. If not, can something be done with micro-sites that is acceptable? Apart from the desired action of users clicking through to the main site, I want to ensure all other features are above-board, such as not using hidden keywords and focusing each site on one specific feature. 
Examples: * www.bigtractors.com: _\"Farmaceuticals Ltd sell some of the biggest tractors anywhere. Here's a link to the website: www.farmaceuticals.com\"_ * www.purpletractors.com: _\"Farmaceuticals Ltd sell some of the most purple tractors anywhere. Here's a link to the website: www.farmaceuticals.com\"_ What makes a micro-site a destination worth ranking, rather than something seen as a direct doorway to redirect visitors? One _extreme_ step could be to provide content that is completely unique to each micro-site, and does not appear on the main site. But I wonder whether, despite all efforts, having a link to take people elsewhere will make each micro-site appear as a 'doorway' whatever I do?"} {"_id": "14627", "title": "Understanding the EU Cookie Directive Rules?", "text": "I work on a service that has multiple EU clients. The service is hosted and maintained in the US. Before I even worry about compliance, I'm trying to figure out whether my service needs to comply at all. Is there anything out there that explains the new EU cookie directive rules in plain English?"} {"_id": "14620", "title": "Will using social submitting/seo marketing tools harm my site in the long run?", "text": "I'm looking to drive more traffic to some blogs I maintain. We've had good success with submitting to sites like Reddit and Digg, but the process is long and tedious with the amount of content we have. I saw products like Social Submitter and xgenseo that automatically create accounts (scarily, xgen also creates dummy Yahoo mail accounts and even solves the captcha) and submit articles for you. The tools seem like an extremely easy way of building up back links on relevant and important social sites without the tediousness of doing it manually. The only thing I'm worried about is that using tools like this will eventually harm my blogs' traffic. 
Does anyone have any experience with such tools or opinions on the subject?"} {"_id": "31947", "title": "fsockopen() error : Network is unreachable port 43 in php", "text": "I've written some PHP code that looks up a domain (whois) but it fails! This is some of my code: function checkdomain($server,$domain){ global $response; $connection = fsockopen($server,43); fputs($connection, \"domain \" . $domain . \"\\r\\n\"); while(!feof($connection)){ $response .= fgets($connection, 4096); } fclose($connection); } checkdomain(\"whois.crsnic.net\",\"www.example.com\"); The code works on my localhost (Apache, PHP, MySQL, OS -> Win XP), but when I uploaded it to my host (Linux) it failed, and I always see the below error message: Warning: fsockopen() [function.fsockopen]: unable to connect to whois.crsnic.net:43 (Network is unreachable) in /home/hamid0011/public_html/whois/whois.php on line 37 What should I do? Is this my host's problem, the whois server's (but it works on localhost), or my code? Thanks"} {"_id": "63113", "title": "Why won't webmail receive emails from outside my domain?", "text": "I'm new to GoDaddy Linux cPanel web hosting. I've created multiple email accounts @my-domain using cPanel. These addresses can send mail successfully to each other and also to any other email account outside my domain (like Gmail). But all accounts @my-domain can only receive email from each other, not from outside my domain. That is, if I send an email from my Gmail to my @my-domain account it doesn't appear in my @my-domain account web mail inbox. Why is this happening and what's the solution?"} {"_id": "65225", "title": "What is the meaning of this sitemap error in Google Webmaster Tools?", "text": "In my Google Webmaster Tools, I received an error which I am not understanding: > When we tested a sample of the URLs from your Sitemap, we found that some of > the URLs were unreachable. 
Please check your webserver for possible > misconfiguration, as these errors may be caused by a server error (such as a > 5xx error) or a network error between Googlebot and your server. All > reachable URLs will still be submitted. I received this warning for a URL that is currently working perfectly. Can you please tell me if this a serious warning, or what it means?"} {"_id": "58567", "title": "Get IP range of \"Microsoft Corp\" network provider", "text": "I'd like to find the IP range of the \"Microsoft Corp\" network provider. How can I find this out? So far I've only been able to find this site which has a list of ip addresses, not necessarily an explicit range. There is a related Stack Overflow question, but doing a \"whois `157.56.237.102`\" on any whois site I've yet found doesn't give a range (have left a comment hoping for clarification)"} {"_id": "38644", "title": "Google Blogger Website CName and/or Text File Issues", "text": "I have a blogger Blog website and I would like to have it show up on my company website. I have read a couple articles out there on how to do it. A hand full of them talk about using FTP which is old and no longer available. However, I am trying to following along with this one: http://www.infinite42.com/small-business/integrate-blogger-blog-website Which seems pretty easy but I am having a problem getting Google to Verify the DNS CName or Text Record that I created on my Windows 2007 Server. Do I need to create this record at the registrar level. Right now the domain is setup at the registrar to point the www record to my server where on my server I tried the Txt Record and the CName Record with no luck in DNS. Here are the Google instructions for creating a CName file record in DNS: Follow the steps below to create a DNS (Domain Name System) record that proves to Google that you own the domain. Add the CNAME record below to the DNS configuration for abc.com. CNAME Label / Host: CNAME Destination / Target: Click Verify below. 
When Google finds this DNS record, we'll make you a verified owner of the domain. (Note: DNS changes may take some time. If we don't find the record immediately, we'll check for it periodically.) To stay verified, don't remove the DNS record, even after verification succeeds. Here is the link to do it with a CName: http://googlewebmastercentral.blogspot.com/2012/08/domain-verification-using-cname-records.html When I go to add my CName record in my server's DNS, the only two fields available are Alias Name and Fully Qualified Domain Name. How am I supposed to create this record? Can someone please tell me? Thanks, Frank"} {"_id": "58218", "title": "AdSense: Will Google ban me if I list \"No ads\" as a reason to sign up at my site?", "text": "Can't find anything about this in Google AdSense's TOS. Basically I want to remove ads for logged in users on my site -- but will Google allow me to use this as a value proposition on my front page? > Sign up today for extra benefits: > > * No ads > * Personalized profile page > * Bla bla >"} {"_id": "62886", "title": "Homepage redirect and impact on SEO?", "text": "My website is basically a gallery of pictures that I display one at a time (as separate posts). I have no need for a landing page, and so I redirect all users who visit the homepage to a random post: <?php if (have_posts()) { query_posts('orderby=rand'); while (have_posts()) { the_post(); wp_redirect(get_permalink(), 302); } } ?> My homepage is still, and should remain, the most visited page, and so I am wondering if the redirection will have any negative impact on SEO."} {"_id": "44642", "title": "Top commentators widget compatible with Facebook comments", "text": "I created a project on `freelancer.com`, but then I was wondering whether this is possible, knowing Facebook comments have their own restrictions and limitations. \"Hi, I want a widget on the sidebar that displays the top commentators of the website. I am currently using Facebook comments. 
The widget should show something like the names of the top 3 commentators and the number of comments they left on the site...\" So I just want to know if something like this is doable with Facebook comments."} {"_id": "68286", "title": "Duplicate page content and the Google index", "text": "I have static pages with dynamically expanding content that Google is indexing. I also have deep links into virtually duplicate pages which will pre-expand the relevant section of content. It seems like Google is ignoring all my specialized pages and not putting them in the index, even after going through Webmaster Tools and crawling and submitting them to the index manually. I also use the Google API for integrating search on the site, and the deep-linked pages won't show up. Is there a good solution for this?"} {"_id": "68289", "title": "How to stop browser from rejecting my downloads?", "text": "I have a portfolio site where I am trying to host some of my work, so people can download it. Some of these files include exe executables, and some are `.jar` executables, which are run through batch. When a user tries to download my apps, it says that the file is not commonly downloaded and may be harmful, and therefore blocks the download. If I zip the folders, it still does the same thing. Whatever format I choose, the download is still blocked. How can I stop Chrome from doing this? Is there a way I can verify my files so they will be considered as trusted?"} {"_id": "58210", "title": "Do many users turn off cookies?", "text": "Just how generally prevalent is it that users have cookies disabled in their browsers? I want to set a cookie during a user's session so that all the pages know that the presence of a particular software program (required for certain functions on my site) has been detected. I can't really require cookies on this website, but cookies would be the easiest way to do what I need to do. 
So, do many users turn cookies off?"} {"_id": "58213", "title": "Why is my DKIM invalid?", "text": "When I use http://dkimcore.org/c/keycheck to test my DKIM it says This is not a good DKIM key record. You should fix the errors shown in red. DNS query failed for 'key1._domainkey.board67.com':NXDOMAIN A public-key (p=) is required But I do have a public key ![enter image description here](http://i.stack.imgur.com/1W3ON.png) Now I am getting this error: p= MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCvjT4cF+/SV69t38ihp7TOMZ2m ruxwtcicE/DmuZJdMcHuEmN9iN03Q8wMdU4TFeirkF79nJMh4wDma1N2LGUiOZrv j1YtCpKZZRlV0IAr7MtA7jjaBEGNU5EsWYcZXriGhkyzl39TXdVIaBuuuqWBN9lk pD+jlA6zCM3nPL6b1wIDAQAB The p= field must be base64 encoded"} {"_id": "57390", "title": "Google doesn't show my site's results even for a direct search", "text": "> In order to show you the most relevant results, we have omitted some entries > very similar to the 21 already displayed. > > If you like, you can repeat the search with the omitted results included. When I do a direct search for my blog. This is very disturbing because I've created more than a dozen articles and none are shown in the SERPs for a direct search. The only way to view my pages seems to be on the third page when all the results are exhausted there's the text I've paste above and it's linked. Now clicking that link is the only way to view my pages. You can check this phenomenon out here: http://google.com/search?q=moviesreva.blogspot.com/"} {"_id": "95", "title": "How do I copyright my website?", "text": "My website has been in development for a long time and cost a lot of money. How do I go about securing an enforceable copyright on my website design and what is the proper way to post a copyright notice on the site so others know the work is copyrighted?"} {"_id": "38649", "title": "Feedburner is Displaying an Inactive Email Address When Logged In", "text": "I have set up a Feedburner subscription through my Google account login. 
For some reason Feedburner displays an old inactive email address in the top right when logged into Feedburner. I'm concerned since I don't have access to this email account. If for any reason I need to move my Feedburner feed, I'm concerned I'll need to access this email account, which is impossible. My email address is correct in every other Google account, profile etc. and has been updated to a working and correct email address. How can I change this? Do I need to be concerned?"} {"_id": "58760", "title": "What images to use when using a post to a link", "text": "I want to be able to use some kind of images for when I put up a post to a link (story, article somewhere else on the web), but what are the rules for what images I can use? Can I use an image from the source that I'm linking to, or do I have to use stock images exclusively?"} {"_id": "57366", "title": "Schema.org: Having a Product as \"about\" property", "text": "I want to define a Product as the main content of a webpage using Schema.org markup. My idea with the following HTML is to use the structure: - Webpage -- WebPageElement (mainContentOfPage of Webpage) --- Product (about of WebPageElement which is mainContentOfPage of Webpage) However, using this markup, Google does not seem to recognize the Product properties such as the aggregateRating. The \"structured markup tool\" is only satisfied if I remove the \"about\" property on Product. But then the structure breaks into: - Webpage -- WebPageElement (mainContentOfPage of Webpage) -- Product Product is no longer a part of WebPageElement. Even if I use the property \"mainContentOfPage\" directly on the Product node, I get the same result: the Product is not recognized properly. It seems like the Product node cannot have any itemprops. 
**So how should I proceed?** <body itemscope itemtype=\"http://schema.org/WebPage\"> <div itemscope itemtype=\"http://schema.org/WebPageElement\" itemprop=\"mainContentOfPage\"> <div itemprop=\"about\" itemscope itemtype=\"http://schema.org/Product\"> <h1 itemprop=\"name\">Acme Toaster Oven</h1> <div itemprop=\"description\">It toasts AND bakes.</div> <div itemprop=\"aggregateRating\" itemscope itemtype=\"http://schema.org/AggregateRating\">Rated <span itemprop=\"ratingValue\">3</span>/5 based on <span itemprop=\"reviewCount\">2</span> reviews</div> <div itemprop=\"review\" itemscope itemtype=\"http://schema.org/Review\"><span itemprop=\"name\">A great toaster</span> - by <span itemprop=\"author\">John</span>, <meta itemprop=\"datePublished\" content=\"2013-10-16\">October 26, 2013 <div itemprop=\"reviewRating\" itemscope itemtype=\"http://schema.org/Rating\"><span itemprop=\"ratingValue\">5</span>/5</div> <span itemprop=\"reviewBody\">First I had bread. Then I had toast. Magic!</span> </div> <div itemprop=\"review\" itemscope itemtype=\"http://schema.org/Review\"><span itemprop=\"name\">A small oven</span> - by <span itemprop=\"author\">Mary</span>, <meta itemprop=\"datePublished\" content=\"2013-10-16\">October 26, 2013 <div itemprop=\"reviewRating\" itemscope itemtype=\"http://schema.org/Rating\"><span itemprop=\"ratingValue\">1</span>/5</div> <span itemprop=\"reviewBody\">My 18-pound turkey wouldn't fit in this thing.</span> </div> </div> </div> </body> The HTML can be tested here: http://www.google.com/webmasters/tools/richsnippets ## Update After experimenting with \"itemref\", I got pretty good results with this code in Google's tool and Yandex and the Structured Data Linter. 
<body itemscope itemtype=\"http://schema.org/WebPage\"> <div itemscope itemtype=\"http://schema.org/WebPageElement\" itemprop=\"mainContentOfPage\"> <meta itemprop=\"about\" itemscope itemtype=\"http://schema.org/Product\" itemref=\"theProduct\" /> </div> <div id=\"theProduct\"> <h1 itemprop=\"name\">Acme Toaster Oven</h1> <div itemprop=\"description\">It toasts AND bakes.</div> <div itemprop=\"aggregateRating\" itemscope itemtype=\"http://schema.org/AggregateRating\">Rated <span itemprop=\"ratingValue\">3</span>/5 based on <span itemprop=\"reviewCount\">2</span> reviews</div> <div itemprop=\"review\" itemscope itemtype=\"http://schema.org/Review\"><span itemprop=\"name\">A great toaster</span> - by <span itemprop=\"author\">John</span>, <meta itemprop=\"datePublished\" content=\"2013-10-16\">October 26, 2013 <div itemprop=\"reviewRating\" itemscope itemtype=\"http://schema.org/Rating\"><span itemprop=\"ratingValue\">5</span>/5</div> <span itemprop=\"reviewBody\">First I had bread. Then I had toast. Magic!</span> </div> <div itemprop=\"review\" itemscope itemtype=\"http://schema.org/Review\"><span itemprop=\"name\">A small oven</span> - by <span itemprop=\"author\">Mary</span>, <meta itemprop=\"datePublished\" content=\"2013-10-16\">October 26, 2013 <div itemprop=\"reviewRating\" itemscope itemtype=\"http://schema.org/Rating\"><span itemprop=\"ratingValue\">1</span>/5</div> <span itemprop=\"reviewBody\">My 18-pound turkey wouldn't fit in this thing.</span> </div> </div> </div> </body> The only side effect is that the reviews and aggregateRating seem to connect to both the WebPage and the Product. I don't know if that's bad."} {"_id": "29386", "title": "Is there a domain search tool on the web that works well?", "text": "I would like to search for all available domains ending with a particular word. 
The best tool I've found for this so far is domainsbot, but it doesn't seem to work as it should (it will only give you the first 10 or so results for your criteria). Does anyone know of a good tool on the web for this type of search?"} {"_id": "57940", "title": "What is occurring in this .htaccess file in regards to rewriting to clean URLs?", "text": "I have the following code in my _.htaccess_ file, which is working fine and using \"pretty\" permalinks to reroute everything back to `/index.php`, and using `$_GET` values to serve customized content based on which URL was requested. RewriteEngine on RewriteBase / RewriteCond %{REQUEST_FILENAME} !-d RewriteCond %{REQUEST_FILENAME} !-f RewriteRule ^([-\w.]+)$ index.php?page=$1 [R=302,L,QSA] I'm just trying to get an understanding of how this code is working: I get that the two _RewriteCond's_ are there to avoid this behavior in the case that someone is requesting an actual file or directory. But my question involves how removing those two _RewriteCond's_ causes a different error. If they are removed, and I request `example.com/register`, the browser is redirected to `example.com/index.php?page=index.php`. I can't see how these two things are related, so I was just wondering if anyone could explain why this happens? (I have `R=302` set just for now so I can see where it's redirecting me to)."} {"_id": "61174", "title": "What schema.org should be used for User Profiles?", "text": "I might be taking this a little too far outside of the box, but I'm curious about what Microdata from schema.org to use for a user profile on my site. On the front page for instance (`http://www.findgamers.us`), a random gamer profile loads every time the page is loaded. Would I use the Person schema.org hierarchy to make that structured data?"} {"_id": "52741", "title": "Can minification in the Google PageSpeed service break Adsense?", "text": "A few days ago I applied for an Adsense account. But it was rejected due to the website being down.
Here is the full message: We did not approve your application for the reasons listed below. Issues: * Difficult site navigation * * * Further detail: Difficult site navigation: While reviewing your site, we found that it was down. Google ads may not be published on a site that is not fully launched, functioning, or easily navigable. Once your site is functioning and has enough content for our specialists to review, we will be happy to reconsider your application. If there is a typo in the URL submitted, you can resubmit your application with the correct site by following the directions below. I monitor uptime using Pingdom. The site was not down for a single minute during this period. In fact, during the last month it was down for only 2 minutes. I may have singled out the problem: I'm using the Google **PageSpeed** service. This service has a **JavaScript minification** option. What it did was strip the comment and `<!-- / -->` tags from the Adsense code. So the Adsense code for my site, which was originally like this: <script type=\"text/javascript\"><!-- google_ad_client = \"ca-pub-xxxxxxxxxxxxxxxx\"; /* Top Of Post */ google_ad_slot = \"4962340904\"; google_ad_width = 728; google_ad_height = 90; //--> </script> <script type=\"text/javascript\" src=\"http://pagead2.googlesyndication.com/pagead/show_ads.js\"> </script> looked like this: <script type=\"text/javascript\">google_ad_client=\"ca-pub-xxxxxxxxxxxxxxxx\";google_ad_slot=\"4962340904\";google_ad_width=728;google_ad_height=90;</script> <script type=\"text/javascript\" src=\"http://pagead2.googlesyndication.com/pagead/show_ads.js\"></script> I have disabled minification. Can someone confirm this was the problem? And how should I proceed with reapplication?"} {"_id": "13059", "title": "Back link report", "text": "Is there an application or site that can generate a back link report for a given web page? The sort of information I'd like to see in this report, for each back link, would be: 1. Page's Title 2.
Page's PageRank 3. Page's URL"} {"_id": "36150", "title": "jQuery lies about size of download", "text": "When you go to jquery.com you see at the top right \"Grab the latest version! Choose your compression level:\". It says 32Kb for the Production (compressed) version, but in reality when you download it, it's 91Kb. Go figure whether they forgot to update it or want to mislead people who are new to it into thinking it's a compact thing."} {"_id": "28569", "title": "Why would IE8 on a desktop have 'Tablet PC 2.0' in its user-agent string?", "text": "I am just curious: why would a Windows 7 desktop, with IE8 installed, have Tablet PC 2.0 in its user-agent string? Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; Tablet PC 2.0) Is this a feature in Windows 7? How can I turn this off in IE8? Other browsers on the same computer don't have such a string in the user-agent string they send. As a result, one of our web applications misidentifies this particular **desktop** client as a mobile device (because of the tablet), hence returns the mobile version of our website to it. Thank you!"} {"_id": "13053", "title": "ClickThrough on Google Webmaster Tool and Traffic Source in Google Analytics", "text": "I'm new to SEO and website management, but eager to learn. I manage a newly revamped site and I'm tracking it on Google Analytics and in Google Webmaster Tools. The Webmaster Tools show that I get about 3200 impressions and 180 click-throughs a week. Google Analytics shows that no traffic comes from search engines; all of the traffic is direct. On average, I get about 60-80 visitors a day; shouldn't Google Analytics show at least a few of those visitors as having come from the search engines? What does that discrepancy mean?
I can't seem to wrap my mind around it... Thank you in advance, Svetlana"} {"_id": "13050", "title": "Free tool to analyze competitors' websites", "text": "I am competing to get the highest search ranking in Google for a particular keyword search. My site is now ranked on the third page. I would like to know which backlinks the site ranked 1st is getting, and if possible, what different strategies they have used. Is there a free tool that would allow me to do this? Any tips on what I need to do to proceed? I have done all the basic stuff, ranging from meta tags, title, h1, update your site regularly, keep it clean, relevant, useful, no errors on the page, submit sitemaps, social bookmarking... what else am I missing here?"} {"_id": "9", "title": "Who is a great domain registrar company?", "text": "I have used numerous domain registrars in the past, including Godaddy, 1&1, Host Monster, IX web Hosting, the list goes on. (not talking about hosting) I currently use GoDaddy, however, I'm not too happy with them. I need a great registrar! One who: * makes transferring easy * registration pain free, no up-selling * has great domain management * has amazing customer service Maybe such a company does not exist. I don't know. Do you know of one?"} {"_id": "10746", "title": "Domain Registration", "text": "> **Possible Duplicate:** > Who is a great domain registrar company? Hi, I'm about to make my first domain purchase; actually, I'm going to buy in bulk ... Any idea about a good host and registrar? I'm looking for a good host with lots of features that isn't expensive."} {"_id": "46962", "title": "Suggestion for cheap but good domain-registrar for com, org or net?", "text": "Does anybody have recommendations for good - but cheap - domain-registrars for the com, org or net domain? I'm not after hosting, just registering a few domain-names (unless there's a _really_ good package-deal). Lower prices when registering multiple domains would be a plus.
I'm after good personal experiences here... PS: I am going to use the domains, either hosting them myself or getting a web-hotel later... it's not just to register and park indefinitely."} {"_id": "33851", "title": "PHP Image Gallery without javascript", "text": "I am looking for a simple PHP gallery with the following characteristics: * Pure PHP * HTML * CSS * SQL * Excludes JavaScript * Handles subcategories * Easily customizable I have tried to search, but I only find ones with JavaScript or Ajax."} {"_id": "18701", "title": "Soft links over the Web?", "text": "Take this scenario. There is a server (say, A) at www.mysite.com. Now, there's a dynamic number of servers A knows about (it knows their IP addresses). The problem is, if a user arrives at http://www.mysite.com/B, we need to redirect him to the server that the URI B will map to (assume that there exists a mapping from such URLs to the list of known servers, and that 'B' is valid here). Is there a way of doing this without redirection? (First deployment will be on GAE, so this redirection is not possible). What I want is the browser showing http://www.mysite.com/B, but rendering content as delivered by the server running on B. I think that this _may_ be possible, as multiple webservers are indeed used within the same domain today."} {"_id": "54173", "title": "What is 'lack of original content'?", "text": "It is written everywhere that lack of original content has a negative impact on ranking. But what is lack of original content? (I am not talking about duplicate content.) I guess if you copy other sites' content, this makes sense. But, assuming one develops one's own functionality, but similar functionality is already available on other sites, is this considered lack of original content? Can Google decide not to index such pages (i.e., not give them a chance at all)? Are there other definitions of 'lack of original content'?"} {"_id": "18707", "title": "Google Analytics \"Goals\" ..
numbers seem inflated?", "text": "When I look at the Google Analytics goal report for yesterday, it shows that I had 3,000 conversions for a given goal - 74% completed, and 6% abandoned. * Where is the missing 20%? * My numbers show more like 500 conversions for this goal. Why are Google's numbers so different?"} {"_id": "33855", "title": "Sitemap for non-CMS based website", "text": "I've just finished a new version of my website, but now I don't know how to generate a correct sitemap. The website is based on `PHP` files and folders, and in the blog folder there is Wordpress. This is how my tree structure looks: index.php about.php contact.php support/index.php support/faq.php blog/ and so on... In Wordpress I can generate the sitemap automatically, but how can I include the other links like `about.php` etc.? Thank you so much for your help!"} {"_id": "33500", "title": "How to optimize website for SEO when content is accessed via one search field", "text": "We are developing a website (a sort of information index for companies) where almost all content is accessed via one search field (similar to Google): you type a search phrase, get results and then can click on the link that leads to the actual information page of the company. These are the most important pages to be indexed. On the company page, there are keywords (tags) that you can click on to get results for that keyword. Obviously, search engines won't enter all possible search variants into that search field to get links to the companies, and I guess that keyword linking on the company info page is not enough. I think that submitting sitemaps to search engines should help a lot. But are there any other solutions to get content indexed on a website of this sort?"} {"_id": "18704", "title": "Return first image source from google images", "text": "Is there a way to fetch the first image source from a Google search if I have a search term?
For example, if the input is `tomato`, the output would be `http://www.cksinfo.com/clipart/food/fruits/tomatoes/tomato.png` Thanks!"} {"_id": "33859", "title": "Three-month freeze on domain transfers?", "text": "I am planning on hosting a web app on Google App Engine. I am planning on buying the domain from GoDaddy. Will I have to wait 3 months before I can transfer the domain from GoDaddy to GAE (or at least change the DNS settings to point to the GAE servers)?"} {"_id": "21730", "title": "Webmin install not successful on Rackspace CentOS VPS", "text": "I followed the Rackspace document here exactly: http://www.rackspace.com/knowledge_center/index.php/CentOS_-_Webmin_1.470 to install Webmin on CentOS. Every step is successful, except, in the end, I can't access Webmin using the following link: https://50.57.109.50:10000 It's a freshly activated image. Have you tried to install Webmin on a Rackspace CentOS VPS? * * * I followed the instructions here: http://www.webmin.com/firewall.html As you can see, the port 10000 is open. I can't even deploy Webmin on CentOS. Can you help? Thank you! * * * I used http instead of https, and then it worked. I don't know why; really confusing."} {"_id": "24360", "title": "Stop searches showing tld domain sites appearing as subdirectory of main holding site domain?", "text": "I'm not sure if this is a problem specific to how my hosting is structured or not. Let's say the main domain for my account is maindomain.com (for purposes of discussion). I am building other sites, e.g. anothersite.com and website.com. The folder/directory structure on my hosting would therefore be: maindomain.com (root) + +---> anothersite.com (subfolder) + +---> website.com (subfolder) The domains are mapped such that they map to their respective folders.
So for example, anothersite.com takes the visitor to the index.php in the folder anothersite.com. However, in Google, as well as the search results for these sites and their content - anothersite.com and website.com (what I want), I also see results that include: maindomain.com/anothersite.com maindomain.com/website.com Obviously this is not the way I want search data for these sites to appear. I would like them to be omitted from the search results. How can the hosting be configured so that the results for each of these respective domains are separate, i.e. not concatenated, so if I search for items relating to these respective sites, I only get results for anothersite.com relevant to it, the same for website.com and the same for maindomain.com - also used itself. Such a solution to this problem should not damage the results that I do want to see. I'm aware of robots.txt, but I'm not sure what I need to put in these to stop the mixed-up/concatenated searches without damaging the real results. Or even if all search engines obey them. Is there something I can also do with .htaccess? As it is shared hosting, I don't have complete control over how the folders/directories are structured for this hosting, so I would have to stick with the structure, but ideas as to how to get desirable search results are welcome."} {"_id": "49717", "title": "The effect in Google of the word \"the\" in a URL?", "text": "Keywords in your URL are weighted heavily, and I am aware of most bad practices, but the question as to whether the word \"the\" in a URL, as in `www.thewebdesigner.com`, is a bad idea or not is very hard to find information on. Searching has yielded next to nothing because the definite article is a special case in search algorithms. That is, you end up searching for the word \"the\", which is a bit too random. I'm wondering if this is a good idea, because as in the example case, \"web\" and \"designer\" together in a URL would theoretically be a fantastic step in solid SEO.
However, if Google somehow reads it as a stop word or treats it differently, it might not be so smart. Can anyone tell me the effect of \"the\" at the start of a URL like `thewebdesigner.com`, and if this is good or bad practice, and a good or bad idea for a quality URL?"} {"_id": "17827", "title": "Would reducing the quality and/or size of images fall under fair use?", "text": "I was looking on flickchart tonight and was wondering how they legitimized the usage of movie posters on their site, as I figure that those posters would be copyrighted. However, it's been online for a few years now and the posters are still there, so I figure they're considered fair use. My question is: does this line in their TOS, under copyright complaints, explain it? > It is believed that the use of scaled-down, low-resolution forms of said > media to provide criticism or comment qualifies as fair use under United > States copyright law. So I guess my question is whether that is actually valid, and if so, would the legislation be affected by the physical location of the servers that provide said content? I'm looking into projects that would utilize multiple images, but the whole \"what is fair use\" issue has been both putting the projects on hold and being a stubborn issue to resolve. Thanks to anyone who can help shed some light on this issue."} {"_id": "17826", "title": "How Many Google +1's Does a Website need in order for Google WebMaster's Tools to Show Characteristics?", "text": "I have added the Google +1 Button to my website and discovered the new Social Activity section in Google WebMaster's Tools. Apparently, one of the interesting things you can learn about your audience is demographic data. But in GWT, the _Social Activity > Audience_ section for my site (currently 127 +1's), says the following: > Your site doesn\u2019t have enough +1\u2032s yet to show characteristics But I'm not sure how many +1's are enough.
Google's official help page for the Audience section offers little insight: > The Audience page displays information about people who have +1'd your > pages, including the total number of unique users, their location, and their > age and gender. > > All information is anonymized; Google doesn't share personal information > about people who have +1\u2019d your pages. To protect privacy, Google won't > display age, gender, or location data unless a certain minimum number of > people have +1'd your content. But what is that \"certain minimum number\"? I've tried Googling this but all I could find to date was this page which doesn't answer the question. So how many +1's does a site need before GWT will show me audience demographic characteristics?"} {"_id": "49714", "title": "How to maintain and optimize SEO for multiple store locations with individual websites as subdirectories from a main landing page?", "text": "My question is how to maintain and optimize SEO when moving a site from the domain root to a subfolder, creating a second parallel subfolder, and having the domain root be a landing page that allows users to choose between the two subfolders. Hopefully a (fake) example will illustrate my question best. Imagine the website for a small local business that serves its surrounding neighborhood -- let's call it ACME Cleaners of Crown Heights. Crown Heights is the neighborhood. The URL is `www.acmecleaners.com`. This site has good SEO and does well for searches looking the cleaning services in the Crown Heights area. Now the business has expanded to a second location in a different neighborhood in the same city, let's call it East End. The new location is called ACME Cleaners of East End. As for the website, the homepage, `www.acmecleaners.com`, which used to be the homepage for the Crown Heights location, has been changed to a landing page that simply shows the ACME Cleaners logo and gives the user a choice between the two locations. 
The user is then taken to either `www.acmecleaners.com/crown-heights` or `www.acmecleaners.com/east-end`, which are individual WordPress sites for each location. The location is saved in a cookie and users are redirected to the individual location site next time they visit `www.acmecleaners.com` (with a 302 status code thanks to PHP's `header` function with the \"location:\" parameter). Specific questions: 1. What should I keep in mind to maintain the strong SEO that the Crown Heights location had now that its site is moving from `www.acmecleaners.com` to `www.acmecleaners.com/crown-heights`? 2. Specifically, how do I \"tell\" Google that the right URL for searches for \"cleaning in Crown Heights\" is now `www.acmecleaners.com/crown-heights`, not the domain root? 3. Likewise, what do I do to get searches for \"cleaning services in East End\" to show `www.acmecleaners.com/east-end`? Interestingly, without me doing anything, this search is already generating first-page results for pages within the subfolder, e.g. `www.acmecleaners.com/east-end/about-us`, but not for the location \"homepage\" which is simply `www.acmecleaners.com/east-end`. 4. What SEO should I do for the new landing page at `www.acmecleaners.com` to support good SEO for both store locations/website subfolders? 5. Any other do's and don'ts in this situation?"} {"_id": "17821", "title": "Linkpushing for SEO, real or fake?", "text": "While I was googling today, I noticed this site: http://linkpushing.net/ which promises to push your site to the top of Google's search results by creating random references to your site on random blogs and/or articles. I can't believe that Google doesn't do anything against techniques like this, and I would like to know from someone more knowledgeable than me on the subject of SEO whether it's really possible to game Google this way.
And whether you would suggest using this technique on my site."} {"_id": "54468", "title": "How to set preferred URL in Bing Webmaster Tools", "text": "My site is `www.example.com` and Bing Webmaster Tools keeps telling me to submit a sitemap for `example.com`, so is it possible to stop Bing from bothering me about `example.com`?"} {"_id": "48264", "title": "Database for student portal (a school website)", "text": "I am presently working on a school website; it is going to have a student portal in it. Please, I need ideas on what you think the database structure should look like. It's a university website; students are in different departments under different faculties. The students will also be taking different courses based on their department."} {"_id": "37842", "title": "Excluding URL query parameters and setting the site search query param in Google Analytics", "text": "I have a free Google Analytics account and would like to know: if I strip \"searchparam\" out of my URLs as the `site search query parameter`, do I then need to specify \"searchparam\" in the `Exclude URL Query Parameters` field under Profiles > Profile Settings? Or is this unnecessary?"} {"_id": "8243", "title": "Is it important to show the current year alongside my (c) Copyright symbol?", "text": "Should I update the copyright notice every January, as in > (C) ACME Corp 2011 I'm not asking about how or whether to claim copyright at all (that is dealt with here), but whether the **year** has any importance."} {"_id": "3777", "title": "Will having multiple domains improve my seo?", "text": "Let's say I have a domain already, for example www.automobile4u.com (not mine), with a website fully running and all. The title of my \"Website\" says: <title>Used cars - buy and sell your used cars here Also, let's say I have fully SEO'd the website so that when people search for the term **buy used cars**, I end up on the second or first page.
Now, I want to end up higher, so I go to the Google AdWords page where you can check how many searches are made on specific terms. Let's say the term \"used cars\" has 20 million searches each month. Here comes the question: could I just go and buy the domain matching the search term, in this case **www.usedcars.com**, and make a redirect to my original page, so that when people search for \"used cars\", my newly bought domain name comes up, redirecting people to my original website (www.automobile4u.com)? The reason I believe this benefits me is that it seems search engines first of all check website addresses matching the search, so the query \"used cars\" would automatically bring www.usedcars.com to the first result, right? What are the downsides of this? I already know about Google spiders not liking redirects, but there are many methods of redirecting... Is this a good idea generally?"} {"_id": "12592", "title": "How does SEO work when redirecting to a different site/domain", "text": "> **Possible Duplicate:** > Could I buy a domain name to increase traffic to my site like this? I have a site (e.g. www.foo.bar). When someone attempts to browse this site, they are redirected to a different site (e.g. www.test.com). The only reason I have foo.bar is the domain name. test.com is a 3rd party. I want people to come to foo.bar, so I can market on the domain name. Say I am selling parts (e.g. Ford car parts). I want to configure SEO so that my domain name and \"Ford car parts\" are associated. The intention is that when someone googles \"foo.bar\" and \"Ford car parts\", search engines start ranking my site/domain. foo.bar isn't the site housing the sale of the parts....the 3rd party is, but I would like to SEO woo.foo.bar if possible to accomplish my intention. Is what I am looking to do possible?"} {"_id": "57052", "title": "Domain Extensions - redirect from .com or .org to country-code domain like .am", "text": "I was wondering one thing.
Actually, `.com` and all generic domain extensions specify that the site is available for all countries, while country-code domain extensions like `.fm`, `.mx` or `.sh` specify to search engines that your site is developed for that specific country. What if I buy 2 domains like `thisismysite.org` and `site.am` and then redirect `thisismysite.org` to `site.am` - will `site.am` be treated like a generic domain extension then? Or will I still need a generic domain to tell search engines that my site is generic and available in all countries?"} {"_id": "60166", "title": "2 domains pointing to a single website with different meta descriptions and titles in the SERP", "text": "Let's say I have one website `abc.com.my`, and now I want to create a new domain `xyz.my` but with its own meta description and meta title which points to `abc.com.my`. So I need to create one `index.html` inside the `xyz.my` domain then redirect to `abc.com.my`. So in the SERP I hope to see `xyz.my` with its own meta description and meta title and `abc.com.my` with a different meta description and title, but both point to `abc.com.my`. So will this affect my ranking, or is it bad for SEO? ![expected search results](http://i.stack.imgur.com/7DNjA.jpg)"} {"_id": "30929", "title": "Is it a good technique to buy abandoned domains to get link juice?", "text": "> **Possible Duplicate:** > Could I buy a domain name to increase traffic to my site like this? Yesterday I was discussing with a friend an idea he had. To promote your site about, for example, fashion (i.e. example.com), the idea is: 1. Find an expired domain (i.e. fashionsite.com) with a good amount of links from fashion websites (that is, a site that has been abandoned but still keeps a lot of inbound links). 2. Recreate the site structure (i.e. if you have an inbound link that points to fashionsite.com/files/boots.html, you have to add a page that responds to this structure). 3.
Add links from your new site (fashionsite.com) to the one you want to promote (example.com) in order to give the link juice to your site. 4. Goto 1 until you cannot find more expired domains related to your site topic. This is a non-trivial process that could take a lot of time, so the question is: is it worth it? Has anybody tried something like this? In fact, maybe that's a well-known technique my friend and I hadn't heard about... I definitely believe we haven't discovered a new technique :) NOTE: I forgot to explain that this idea is specifically to improve a site's link building for Google; we are not interested in other search engines at the moment."} {"_id": "19386", "title": "Keywords in domain name", "text": "> **Possible Duplicate:** > Could I buy a domain name to increase traffic to my site like this? Let's assume we have a generic website \" **MyWebsiteAboutGamesAndThings.com** \". We discover that a lot of people are searching for \" **Make games** \". We find that the domain name **MakeGames.com** is free, so we register it with the assumption that a keyword-specific domain such as this will rank favorably for the search term we have discovered. We now have two options with the domain name: * We can 301 redirect **MakeGames.com** to **MyWebsiteAboutGamesAndThings.com** * Or we can make a simple webpage on it that intends to feed visitors to **MyWebsiteAboutGamesAndThings.com**. The reason for registering **MakeGames.com** was to target that specific search; however, I have a feeling that you lose the benefit of the keywords in the domain by 301ing it. Is it better to make a sort of landing website for **MakeGames.com**? If you 301 the domain to **MyWebsiteAboutGamesAndThings.com**, will it now rank better for the search **Make games**? Which is a better method?"} {"_id": "25367", "title": "Add-on domain for better SEO", "text": "> **Possible Duplicate:** > Could I buy a domain name to increase traffic to my site like this? So here's the thing.
Suppose I have my domain called BUSINESS.COM that sells shoes. Now, I want people to find me through Google by searching \"best shoes\". So on my BUSINESS.COM site I will set all meta descriptions and titles to have a relationship with the keyword \"best shoes\". I was wondering: is there any way I can also buy the domain BEST-SHOES.COM, make it an add-on domain to my hosting and redirect it to BUSINESS.COM to improve SEO? I mean, so that when someone searches Google for best shoes, my site will appear first because it has those keywords in the URL. Or is the only way to improve SEO to make another website, with no duplicate content, on BEST-SHOES.COM that links very lightly to my site BUSINESS.COM? Like a site that reviews shoes."} {"_id": "39412", "title": "What is the best way to use multiple similar domains for a single business with SEO aspects?", "text": "> **Possible Duplicate:** > Will having multiple domains improve my seo? **Scenario:** There are around 15 freshly registered domains. All the domains are intended to support a single business activity. **Example:** * something-useful.com * something-extraordinary-useful.com * extraordinary-useful-something.com * useful-something.com * etc... What will be the best approach to use these domains to get the benefit of SEO? Or to promote the business online? What are the possible approaches to use these domains?"} {"_id": "47709", "title": "Multiple domains", "text": "I got 6 domains: `company.ca` and `company.com` (because both were free; we are a Canadian company but can do business with the rest of the world). We sell sportswear, but the company name is totally unknown to the world, so we have also bought product-specific domains: `chandails.ca` and `t-shirt.ca` as well as `shorts.ca` and `shorts.com`. So those 6 domains are mine. Now what is the best way to use them? 
Should they all be 301 redirected to the main company domain (`.com`), or should I make micro-sites - a super simple one-page site optimized just for shirts and one for shorts - and then tell people that to learn more they should go to the _main_ site? Because right now I cannot really see the benefit of the _search word in domain name_ edge if nobody ever sees anything on that domain... I am confused and cannot find a straight answer to this question."} {"_id": "59005", "title": "Does SEO ranking based on domain apply to domains with forwarding?", "text": "Let's say I have a site at \"myrandomdomain.com\". I want to target a keyword \"my keyword\" and so I buy \"mykeyword.com\". Can I still have the keyword benefits of having mykeyword.com when it just forwards to myrandomdomain.com? The domain would not be masked. My guess would have been no, but SEO Moz seems a bit unclear about it: > A 301 redirect is a permanent redirect which passes between 90-99% of link > juice (ranking power) to the redirected page. I'd just like to make sure before I switch my site over. If it would have the same effect to forward it, I'll just do that."} {"_id": "24026", "title": "Will having multiple domains with redirects to my main domain have an impact on search rankings?", "text": "> **Possible Duplicate:** > Could I buy a domain name to increase traffic to my site like this? If I register 10 or so related domain names and 301-redirect them all to appropriate pages within my website, will this hurt or help Google rankings? Or will it be a neutral event? For example, if I operated a daycare center, would it be wise to purchase bostonchildcare.com, bostondaycare.com and bostonpreschool.com and then 301 redirect them all to appropriate pages on my site? Thanks for your help with this!"} {"_id": "50680", "title": "Is it bad for SEO to forward multiple top level domains to just one domain?", "text": "For example, I registered `myName.com` and I also registered the `.net`, `.info`, and `.org` extensions for `myName`. 
I'm wondering if it will hurt my SEO to forward these other TLDs to `myName.com`?"} {"_id": "49712", "title": "Is it okay to redirect all mobile users to a mobile coming soon page?", "text": "I recently read this article on the Google Webmasters blog which said that we should avoid redirecting mobile users to one single page, as it interrupts their flow. Now I have a site on which I have a mobile coming soon page to which a user is redirected the first time he/she visits the site. Once a user clicks \"view web version\" on this page, a cookie is set and he/she is taken to the homepage. Now if the user comes again to my site, he/she would be taken to the respective page and wouldn't be redirected. Can this incur SEO penalties? I am not interrupting the user flow every time, so should I bother much? Should I scrap this altogether, or alternatively could I redirect the user to the page they originally requested when they click \"view web version\" instead of the homepage?"} {"_id": "374", "title": "URLs: Should I use hyphens, underscores or plus symbols?", "text": "**Which is better for search engines?** 1. `example.com/my_cool_page.html` 2. `example.com/my-cool-page.html` 3. `example.com/my+cool+page.html`"} {"_id": "49958", "title": "URL encoding - what difference to SEO? + or _ or -", "text": "I have a URL e.g.: http://mysite.com/tracks/backing-track/101153/ABBA/Super+Trouper **QUESTION** does it make a difference to SEO when using `+` or `-` or `_` in the URL between words? Is one more readable and indexable than the others?"} {"_id": "11773", "title": "SEO - File naming", "text": "> **Possible Duplicate:** > Should I use hyphens or underscores? Hi, for the file name \"mission-and-vision.htm\" or \"mission_and_vision.htm\", which one is good for SEO (separator \"-\" or \"_\")? Thanks"} {"_id": "20206", "title": "How To Track Down and Stop Rogue Bots?", "text": "**Most of the bandwidth of one site is being consumed by an unidentified bot**. 
According to AWSTATS, an unknown robot (identified by 'bot*') consumed 164 GB this month. By comparison, Googlebot consumed 10 GB and visitors (viewed traffic) consumed 25 GB. This means rogue bots are consuming over 6X the bandwidth of visitors. For other sites which I run (about a dozen) the **normal ratio is 25%**, so for 25GB of viewed traffic, bots take about 6GB in TOTAL. The question therefore is: **How can I identify which bot(s) are causing this huge number of requests, and how can I stop them or slow them down if they are useful?** Obviously, most bots that visit the site are important, including Googlebot, Yahoo Slurp, MSNBot, etc., as well as the AdSense/DoubleClick bots, so I cannot simply block all bots. _The reason I am investigating this is that I am reaching the limit of bandwidth and have exceeded CPU usage for my host, so I was sent a notice._"} {"_id": "20255", "title": "Moving a Domain Away from Google Sites but Not Apps", "text": "First, I looked for a question like this but couldn't find one. There are a lot of people who want to move a domain to point at Google Sites - I actually want to move it away from Google Sites and point it at a hosted server. I was able to figure out how to do this, but then it broke my client's calendar and email. I created matching email accounts on the new server, but that doesn't fix his contact list and calendar, which used to be synced with Google. So my question is: can I move a domain away from Google Sites and point it at a hosted website without breaking the email and calendar? If the answer is no, then we could always use two separate domains: one for email/calendar and one for the website, but that doesn't seem like the perfect solution since really his email should be at the same domain as his website. Will he just need to use a different address to sync his calendar and contacts? I appreciate any help. 
Thanks!"} {"_id": "34604", "title": "Can I place a directory that lists other companies' contact details on my commercial website?", "text": "Can I place a directory that lists other companies' contact details and other relevant information on my commercial website? We are a magazine site and this would be placed as an additional resource for readers. We have no legal connection with the corporations we would be listing."} {"_id": "52575", "title": "Is it possible to block access to a DB based on IP?", "text": "We're looking to beef up the security on a site; among other things, we thought of trying to block access to the DB based on IP (in a similar way to how you can do an FTP lock dependent on IP), so that only the static IP of our VPS can access it, as well as our local office IP (also static). Is this a standard approach? I tried searching for it to see any notes on implementation / best practices, but I couldn't see anything written about it."} {"_id": "29768", "title": "Open source Rails/Sinatra/Ruby web mail client?", "text": "Is anyone aware of any? I'm specifically looking for any that are well maintained (in the Ruby community that generally means the source is available on GitHub). Here's what I've found in my research: ### MailCatcher MailCatcher runs a super simple SMTP server which catches any message sent to it to display in a web interface. * official site * source code ### Sup Sup is a console-based email client for people with a lot of email. It supports tagging, very fast full-text search, automatic contact-list management, custom code insertion via a hook system, and more. If you're the type of person who treats email as an extension of your long-term memory, Sup is for you. * official site * source code ### ActionMailer * source code"} {"_id": "14031", "title": "One page website - Effect on SEO", "text": "I'm considering re-designing my personal website to promote my web services in my local area. 
My issue is whether to do a one page website with a scroll-to navigation system. What effect does this have on SEO? As I understand it, you set things like h1s and the meta description on specific page content. How does this work when all of your different content is on one page?"} {"_id": "34774", "title": "What about SEO in one page website with ajax loaded content?", "text": "> **Possible Duplicate:** > One page portfolio - Effect on SEO As my title says, I'd like to build a website with just one text input for searching restaurants, and I would like to load the results in a list via AJAX on the same page. After the list is loaded, if you click on one row for restaurant details, it loads all the restaurant details via AJAX. What about SEO in a website structure like this? Is there a way to index every single restaurant? I'm pretty new to SEO and every comment will for sure be important to me in order to understand and learn more about it. Cheers"} {"_id": "56953", "title": "SEO best practices for one page websites?", "text": "I build many one page websites that dynamically load content through AJAX/jQuery. I keep reading how bad one pagers are for SEO. Regardless, how can you get the most SEO results out of a one page website, without splitting it up into multiple pages?"} {"_id": "67103", "title": "Restrictions on transfer of .US domain to non-US citizen?", "text": "Four years ago an American friend of mine purchased a `.us` domain for me as a wedding present. A year later, when the domain renewal came up, she simply transferred the domain to me, and it has been registered, with my valid UK details, ever since. 
On Wikipedia it lists the following under ' **Restrictions on use of .us domains** ': Under .US nexus requirements .US domains may be registered only by the following qualified entities: - Any United States citizen or resident, - Any United States entity, such as organizations or corporations, - Any foreign entity or organization with a bona fide presence in the United States The _heading_ implies that there are restrictions on the _use_ of the domain, yet the section itself only refers to the _registration_ of the domain, and says nothing about its use. Since it was originally registered by a U.S. citizen and then transferred to me (a UK citizen) later, is this still an issue? Am I breaking ICANN rules? The main reason I ask is because the domain is set to become the basis of a new business venture, and I don't want to have the rug pulled out from under me."} {"_id": "29816", "title": "How can I show embedded images in HTML e-mails sent from my website in Hotmail?", "text": "When I send HTML e-mails from my website with my logo as an embedded image, Hotmail hides it by default with the message `Attachments, pictures and links in this message have been blocked for your safety.` Since this is my logo, it makes the e-mail look very funny. Is there some certification/verification process to allow embedded images from my domain to be shown by default in Hotmail and other major e-mail service providers?"} {"_id": "8487", "title": "How to Remove URLs from Google Search Engine", "text": "I have tried so many things, like URL removal requests and deleting the pages, etc., but the errors are still shown in my Webmaster Tools. How can I remove the URLs completely from Google?"} {"_id": "32289", "title": "Remove internal sub domain duplicate content on Google", "text": "> **Possible Duplicate:** > How to Remove URLs from Google Search Engine _Our hosting provider has an internal sub domain like so:_ `n265-269-265-265.cnet.hosting.iiii`. 
The internal sub domain is redirected to our registered domain and is publicly available. Google indexed the internal domain content and now it is appearing on search pages. We'd like Google to remove the internal sub domain content, and we would like to set up a redirect to the main domain. Our hosting provider is actually a sub-provider and, as far as we know, does not fully control the main hosting domain, and I am stuck in a discussion with them about what can be done. I would like a solution to this issue."} {"_id": "60612", "title": "how to remove the old urls from google search and replace them with new urls?", "text": "I know this question has been asked here and here before, but I have a problem with the answers, so here is mine: I have changed my URL structure (not my domain), and I want Google to **remove** the old URLs and replace them with the new ones. I can't set up redirects because I don't know exactly what to redirect (it's very dynamic). How can I do this?"} {"_id": "29813", "title": "How to set suhosin.post.max_vars in a htaccess file", "text": "I need to add these values to my htaccess: suhosin.request.max_vars = 8000 suhosin.post.max_vars = 8000 If I just add them like that, I get an **Internal Server Error**. What is the right way to include them? ![enter image description here](http://i.stack.imgur.com/zkXL9.png)"} {"_id": "29760", "title": "How many visitors block adverts?", "text": "On average, what percentage of visitors block adverts, for example by running AdBlock? (The average obviously depends on the population, which is worth specifying, but any answer's better than none...)"} {"_id": "44429", "title": "How to handle multiple pages of the same site with the same outlinks", "text": "I am developing a back link tool for Chinese SEO (our website URL is `http://link.aizhan.com`), just like ahrefs.com. I encountered a problem: how to handle multiple pages of the same site with the same outlinks. 
For example: most pages of `bbs.chinaz.com` have the same outlinks, such as: bbs.chinaz.com/Tea/thread-6293993-1-1.html bbs.chinaz.com/Tea/list-1.html bbs.chinaz.com/alimama/thread-6265032-1-1.html bbs.chinaz.com/alimama/thread-6265032-2-1.html?userid=-1&extParms= bbs.chinaz.com/Shuiba/list-1.html bbs.chinaz.com/FeedBack/thread-4456753-1-1.html etc. All of the pages have the same outlinks at the top of the page: www.cnzz.com (anchor text: \u514d\u8d39\u7edf\u8ba1) www.313.com (\u5efa\u7ad9\u5927\u5e08) www.idc123.com (\u4e3b\u673a\u4e4b\u5bb6) Suppose I store these outlinks in a database. An SEO will find that there are six backlinks from `bbs.chinaz.com` to `www.cnzz.com`. This obviously makes no sense for the SEO. Can you tell me how you deal with this problem?"} {"_id": "22540", "title": "SEO and URL Shorteners", "text": "When you have a link from, let's say, Twitter that is `t.co/foo` and links back to your website, how does that affect SEO, and how do the major search engines handle shortened URLs?"} {"_id": "59417", "title": "Do URL shorteners make an impact on search engine rankings?", "text": "Can anyone explain to me what the following means (from thesearchenginepros): The best URL shorteners will use a 301 redirect to pass along their Google page rank to the destination web address. This helps the redirected web address advance in search rankings. I was going through the advantages of shortened URLs when I came across this."} {"_id": "66914", "title": "Do short URLs like goo.gl, bitly etc. affect your website ranking?", "text": "I need to know whether using short URLs to back-link to pages affects your website ranking in the Google search results."} {"_id": "65031", "title": "Adding a new site to a Google Custom Search Engine gives fewer hits", "text": "I have a Google Custom Search Engine (CSE) with one \"site\". Adding a second site gives me fewer hits. Removing the second site gives me back the hits I had with only one site. What is going on there? 
I add the sites with XML to be sure of what I am doing, like below: **UPDATE: Here are instructions for anyone who wants to test:** 1. Add these two sites to a new custom search engine (https://www.google.com/cse/): www.zotero.org/groups/from_some_psychologists/items/itemKey/* www.zotero.org/groups/minding_my_mitochondria_terry_wahls/items/itemKey/* 2. Search for `food` (you will get 2 hits) 3. Now delete the first site from the custom search engine. 4. Search for `food` again (you will now get 19 hits) **UPDATE 2:** I also posted the question to Google Product Forums > Google Custom Search: https://productforums.google.com/forum/#!category-topic/customsearch/indexing-and-results/8Qs60jdiFZQ **UPDATE 3:** I previously thought that the hits when having both sites were always only from the second site. That was wrong (so I removed the part saying so)."} {"_id": "1173", "title": "What type of websites shouldn't use a cache?", "text": "Is there ever a time when you should use caching? Like for frequently updated pages? Or should you always cache? Please give examples and reasons."} {"_id": "1172", "title": "How do you enhance your website's speed without compromising the design and access?", "text": "How do you enhance your website's load speed without killing the design and accessibility? File compression, CDN, Gzip? What are the best tools for doing so? For example, Google has optimized their site without compromising the design. Also, many websites can kill the purity of their images with compression. Is there a way, more or less a best practice, to increase speed without compromising the design and accessibility? 
**Note**: sorry for being so vague, but I don't know how else to phrase this question."} {"_id": "1171", "title": "Transferring site from one server to another - good tools?", "text": "I'm not a webmaster - I'm a generalist with a strong bias towards development of applications and databases, but currently I'm helping a client I've consulted with before, and they want to move some files from one hosting provider to another. They are moving more and more stuff to cloud hosting to be able to handle scalability - they get a lot of hits when big events happen and people want their data. Their regular web guy has kind of flaked out on them. It's all static content (data files, reports, etc.); we might need to translate some links, add some Google Analytics, etc. But what is a good tool for performing the transfer initially - to sync up the new site to the old site, before fixing up the broken links, etc.? I'm pretty sure they'll get me FTP access to the old and new sites. Either Mac or PC, whatever's best."} {"_id": "23507", "title": "short and friendly URL for affiliate links", "text": "This is my first question here on Pro Webmasters... I'm really glad to be here! I believe this question will have a simple answer, but I'm starting out with htaccess and I want to learn something about this file and its rules. I have a long and ugly affiliate link and I want to create a 301 redirect to this URL, without typing it. My website is http://www.matthewlabs.com/ and I want to use a link like http://www.matthewlabs.com/wishonlist/appstore which automatically redirects to my affiliate link. I tried to write this in my htaccess file: Redirect 301 /wishonlist/appstore http://myAffiliateLinkHere but it doesn't work. Must the directory /wishonlist/appstore exist to redirect? Because now I receive a 404 error... And to use this link, can I create a simple link with an HTML `a` tag? 
`<a href=\"/wishonlist/appstore\">Click</a>` Thank you so much for your time and I hope you can help me!"} {"_id": "23051", "title": "Can I change title and meta description with 301 redirect?", "text": "I would like to make a 301 redirect from one page to another. Do I need to use the same titles and meta tags, or can I change them to new ones without losing the rank of the page?"} {"_id": "22959", "title": "What could be preventing Google Chrome developer tools' audit from completing?", "text": "I'm trying to run Chrome developer tools' audit on http://www.zmxmusic.com, and it never completes... it just gets up to about \"Loading (61 of 61)\" with the spinner but never shows the results. If I check the Network tab, there are no requests pending; all requests are complete and without errors. The audit works fine on other sites. **EDIT** Added actual site URL."} {"_id": "47212", "title": "Server won't respond to customer - Wordpress site / Hostgator - high volume", "text": "I've got a customer using WordPress. They have been using it for years. It's a school where the kids get on each day and hit the site A LOT. Suddenly, in the last 3 days they get NOTHING back from the server (this ONLY occurs from within the school). If we do a `tracert`, we get out and seemingly get to the server. But if I look at network traffic in Chrome -- I never receive a response. This same thing happens on IE and FF. A look at cPanel shows that no IPs are being blocked. I've never done IP blocking, so I am not sure if this is what one who is blocked would even expect to see. We are a reseller for Hostgator... I wonder if something higher up the chain (maybe on their end) is causing this. I've got a ticket in with them. THIS JUST IN: I found out that the customer can reach the site when they skip their firewall... 
this leads me to believe that it HAS to be their firewall (even if their firewall provider says it's not)."} {"_id": "48028", "title": "Multiple webpages with the same page titles/meta-tags", "text": "I have created several duplicate pages because my website is in 3 languages (yes, I know this is a bad way to achieve this). Will Google flag my site as spam if I use duplicate page titles/meta keywords? Or would other problems develop?"} {"_id": "27619", "title": "GEO Tool - commercial use", "text": "What I want to do: offer my clients the possibility to display their store as a small graphic (like Google Maps) - just a small PNG with the location of the store. Only the specific client is able to see these images, and he pays for this area (not only for that :) ). So I'm looking for an API or whatever else to geocode(?) the address and render a small (280*160px) image. Google offers a premium licence (not cheap); there are also MS Bing and OpenStreetMap - I'm not able to understand the 10^10 licences ;( Any ideas?"} {"_id": "27617", "title": "How do websites with no content rank so high?", "text": "I was just searching for how to close a SIM and I got the website (shown below) on the first page of Google. The website does not have the solution to the problem; there is just a line written about it, but the SEO (or a trick) is played so well that it has got so many real comments. The comments are users asking him to help them close their SIMs. Now I don't get this. There is no content, so how does the site rank so high? URL: `boltaconsumer.com/complaints/zong-sim-block` ![serps](http://i.stack.imgur.com/iyHit.png)"} {"_id": "38442", "title": "Set secondary receiver in PayPal Chained Payment after the initial transaction", "text": "I'm running a service whereby customers seek the services of 'freelancers' through our web platform. The customer will make a 'bid' which is immediately taken from their accounts as security. 
Once the job is completed, the customer marks it as accepted and the bid gets distributed to the freelancer(s) as a reward. After initially storing these rewards in the accounts of the freelancers and relying on MassPay to sort out paying them later, I realised that your business needs to be turning over at least \u00a35000/month before MassPay is switched on. Instead, I was referred to Delayed Chained Payments in PayPal's Adaptive Payments API. This allows the customer to pay the primary receiver (my business) before the payment is later triggered to be sent to the secondary receivers (the freelancers). However, at the time that the customer initiates this transaction, you must understand that nobody yet knows who will receive the reward. So, before I program this whole Adaptive Payments system, is it even _possible_ to change or add the secondary receivers after the customer has paid? If not, what can I do?"} {"_id": "27615", "title": "Why has Apache begun running a single process as root, and another as root and the wwwdata user?", "text": "I am running Apache Worker-MPM, PHP 5.3.2, and FCGID. Apache runs better than before, but I noticed that there is an apache process running as root and a second process that runs as both root and as wwwdata. The rest of the processes are running as wwwdata. Has anyone run into this yet, and is it safe?"} {"_id": "64889", "title": "How to make google index press releases or news items as they are released", "text": "I have noticed that several websites' news items are indexed immediately by Google. Even questions posted on stackexchange.com are indexed by Google within minutes. Can someone tell me how I can do that for a website which has frequent news items? How can I achieve this for a site that frequently has news or press releases posted?"} {"_id": "27611", "title": "Google Analytics site search", "text": "I am trying to enable internal site search tracking with Google Analytics. 
So on the Analytics page, for the account `mysite.com`: Profile Settings => Site Search Settings, I enabled tracking. On my site, when someone makes a search for test, the URL looks like: `mysite.com/search?keyword=test`. As the query parameter I added `search?keyword=`, but I haven't seen any results so far. Any idea how to fix this?"} {"_id": "59039", "title": "Chrome fails to view an HTML file", "text": "I have set up a website which is visible from FF but not from Safari or Chrome. I thought it was because of a faulty WordPress install, but obviously it is not, because I added a test.html in the root: FF views it but Chrome gives an error. This has never happened before; I tried Google but generally found app or HTML problems causing the issue."} {"_id": "39592", "title": "Possible Fraudulent Registration of Company Domains", "text": "Currently we have the domain name `company.com` (I'm using 'company' rather than our name for anonymity). I'm getting ready to purchase several more like `*.org`, `*.net`, etc. to prevent others from registering them and to avoid confusion for our users. Now I have heard that a Chinese company is considering registering several domains like: company.asia company.cn company.com.cn company.com.tw company.hk company.in company.net.cn company.org.cn company.tw A domain registration broker contacted me with this information, so I'm a bit concerned it is a scam. My thought is that this broker contacted me to try to sell these domains to me, and there isn't a \"company\" out there wanting to buy them. So based on the fact that I have no trademarked name, nor do I have any patents protecting this name, and I have been using it since 2004, what is the best course of action? I should note that I have resellers of our products in China, Taiwan and Japan, so I could imagine having those domains and that being a good idea. But this is new to me, so I'm not sure what to do about it. 
Any thoughts?"} {"_id": "39591", "title": "Pull Google Analytics source info into email", "text": "I am trying to find out whether it is possible to pull Google Analytics source info (direct/organic/referrer, etc.) and include that in the email that is sent to us when someone enquires on our website. We have a large account management team for an incredibly high average order value product. Currently, when someone makes an enquiry on our website, an email is sent to all our account managers with the contact details of that person for them to follow up accordingly - there is no automatic integration into our CRM system. To minimize the margin for human error (and having to ask each prospective client where they heard of us), I'd like to include as much of the source information as possible in the email that is sent to our account managers. Is this possible? If you have experience and are willing to take this on as a project, please send me a private message."} {"_id": "59032", "title": "Does image mime-type affect SEO?", "text": "I have uploaded images with the MIME type application/octet-stream; they are served well if sourced from an image tag, but Chrome, for example, cannot open them directly and always asks for a download. The question is: does the wrong MIME type affect findability and SEO?"} {"_id": "59033", "title": "Async Adsense Ads works with Ajax, But is it against TOS", "text": "Google's async AdSense code loads fine even when the page is loaded via AJAX. I made no changes to load the ads differently; I just added https://github.com/browserstate/ajaxify on my site to make the content loading ajaxified. My AdSense loads fine as well. Does it violate any TOS?"} {"_id": "59030", "title": "Is there anything I can do if I found someone copying my content and spreading it on FB/Twitter?", "text": "I made a website and it somehow became very popular last year. Then today I found that someone had stolen my content and spread it on Twitter and Facebook. 
And many users are sharing that fake website. I am very upset, but I don't know what to do. I reported it to Google, but even if Google removes it, it's still spreading on Facebook & Twitter. Is it possible to contact Twitter & Facebook to stop this? Thanks!"} {"_id": "65175", "title": "Hiding duplicate content - should I use meta tags or robots.txt?", "text": "Last week my site was utterly pummeled in Google's rankings - losing 95% of impressions overnight according to Google Webmaster Tools. It now only shows up if you search for the URL/site name itself. I've not engaged in any shady link-building (or indeed any link-building at all!) and the site is technically fine (fast pages, no malware, fully responsive). So my first guess is that I'm being penalized for duplicate content. Although there's a huge amount of rich content on the site, there are also a lot of algorithmically-generated pages - for example, one for each town in the UK. (This isn't done for SEO, but I guess maybe Google thinks it is.) So I need to stop Googlebot finding, and objecting to, this content. I would rather do it via a meta tag (`<meta name=\"robots\" content=\"noindex\">`) on the relevant pages than by using _robots.txt_. So my questions are: * Is a meta tag a workable alternative to using _robots.txt_ for hiding \"problematic\" content? * Is there anything else I should be doing?"} {"_id": "59034", "title": "How to have Gitlab display a Markdown document as source?", "text": "For our current project, we have Markdown source files which aren't acting as project documentation (such as a `README.md`) but rather as the source of the project itself. It would be very helpful if I could instruct Gitlab that these files are not to be rendered into HTML but displayed as raw source files. The source files are intended to be processed by pandoc, so a lot of the formatting gets clumsily misinterpreted by Gitlab. 
Is there a setting to turn rendering off for specific files?"} {"_id": "39598", "title": "Is it safe to use multiple h2 tags for a list of products?", "text": "I am using html5. I have a list of products in my store (132 products), and they are currently being displayed using pagination, with 10 results on each page. All my product names are very keyword rich and I want to get all that long tail traffic, but how do I assign them importance (using header tags) without getting penalized? For example, if I use `
<h2>` for my product names (they are not links, just plain text), then the bots will see 132 `<h2>
` tags, and I think this may be a problem. Any ideas?"} {"_id": "32621", "title": "How can I tell whether a given file is being used on my site?", "text": "I'm administering a site that uses a CMS, and the only access I have to the filesystem is through ajaxplorer. I'd like to remove pictures, documents, etc. that aren't being used or linked to. Is there a way to tell what files on the site are being used, other than scraping the site and manually comparing file lists?"} {"_id": "46753", "title": "How to change a URL using .htaccess", "text": "I don't know anything about coding, but I really want to change the URL of my pages: * `site/data/9/Symbian-3rd-Apps.html` to `site/symbian_applications.html` * `site/data/file/352/Core-Video-Player-Symbian-3rd--sisx.html` to `site/Core-Video-Player-Symbian-3rd--sisx.html`"} {"_id": "31234", "title": "How google handle site traffic in google analytics", "text": "I have a site with address www.exam.com and I have put Google analytics javascript scripts in it. I have made an app for my site, I want that everytime a user uses app, he visit the site in the application with built in browser which is inside the application ( I am using C# for application and .NET web browser ). User will address www.example.com/appvisit in the app and I just have put google analytics scripts in that page and nothing else. And I want to disallow this address /appvisit in my robots.txt file . I want to know that Is there any problem with doing this? will google crawl in the /appvisit directory ? Does google hate this work? and will google think this traffic is true and normal? thanks"} {"_id": "7180", "title": "Adsense alternative for a \"Sex Education\" website?", "text": "I am creating a nice and niche \"Sex Education\" website. No porn, nothing offensive and no scams. I would love to place Google Adsense but they do not allow ads on adult sites. I would like to know what advertising and link-exchange like should I place on my site. 
My sole objective is to cover the server costs and the salary of one or two persons. _(In this way, it is different from \"Best alternative to Adsense for a small website?\")._"} {"_id": "31238", "title": "Web hosting does not matter?", "text": "I have read this question. I agree that web hosting does matter in load time and how much uptime it is giving, but it seems that some companies have given it a huge hype. Please have a look at this example: ![enter image description here](http://i.stack.imgur.com/50xAO.jpg) They claim that in SEO, 23.8% is which hosting you use. Please clarify whether it is a gimmick or carries some weight. Thanks"} {"_id": "47814", "title": "How can I delete all Wiki pages created by spammers except mine in MediaWiki?", "text": "I implemented a MediaWiki site but unfortunately somebody created thousands of pages through thousands of users. Which query could I run for deleting all pages created by these users except mine? MediaWiki pages are in `wiki_page` and `wiki_text` inside the MediaWiki database, but there is no field about who created those pages. Largest tables: * `wiki_text`: 682 MB * `wiki_externallinks`: 162.5 MB * `wiki_recentchange`: 95 MB * `wiki_page`: 43 MB Here I found a possible answer to my problem, but I'm blocked at the first step: \"Export articles created by you (presumably logged in as the WikiSysop user or similar)\" How can I export articles made by me if in the table there is no field about the user who created the pages?"} {"_id": "29105", "title": "Using style.less from lesscss.org doesn't properly show changes I make to the style.less file?", "text": "I inherited a WordPress project that was initially developed by another developer, and for the CSS they utilize the script from http://lesscss.org (which so far seems to not only be overkill but horrible if the user doesn't have JavaScript enabled in their browser). My guess is that it is caching, but even in Safari, if I disable the cache, it doesn't update with my changes.
Safari loaded with some changes, but when I made more changes it only shows the same time from the first page view. Same thing in firefox except firefox doesn't show any of the new changes. This only applies to css changes made in the style.less file. If I make any changing to the html code then every loads correctly. I even tried appending #!watch\u2019 to the url like the web page said, but still no luck. Any idea on what I can do to make the changes load correctly?"} {"_id": "7186", "title": "Nice wordpress to wordpress redirect?", "text": "I have a WordPress blog in http://suportrecerca.barcelonamedia.org/blog/ , and since I can no longer use our company servers for my blog I've had to move it to blog.joanmarcriera.es Google had my old blog well indexed and many people is landing to my old blog. I want to redirect this people to my new blog in a nice way, like a 5 seconds delay with some information or something. Any suggestions? I also would like to let Google index the old blog like usual if it's possible. Thanks."} {"_id": "22950", "title": "Customer Report Google Analytics", "text": "Is it possible to have a custom report per minute for Google Analytics? So far I have found that it is possible to have a report per hour."} {"_id": "622", "title": "Is there a way to get browser info on one specific page with Google Analytics?", "text": "I have an Ajax-based page which has had a history of problems with Internet Explorer (IE). I spent some time working on the JavaScript to try to get it to work in IE. After this I was interested to see how the traffic to this one specific page with IE had changed. However, using Google Analytics, I can't see a way to track \"single-page usage versus specific browser\". Is it possible?"} {"_id": "49445", "title": "Will enclosing an
<h1> element inside a <p> element affect SEO?", "text": "Is it okay (SEO-wise) to construct the following line as such: `<p>These are the best <h1>Cheap Widgets</h1> ever.</p>`
Wondering if this looks too unnatural for search engines?"} {"_id": "12833", "title": "End date on google analytics", "text": "Everytime I open google analytics account I get some default date range which is one month from yesterday to yesterday. Is there a way to set default to be start: some day I define end: last possible day there is"} {"_id": "12831", "title": "How does Amazon.com renew or buy its domain names?", "text": "It's hard to believe that Amazon goes to a site like Godaddy.com to buy or renew domain names. Do they use registrars? If not, how do they buy & renew domains?"} {"_id": "12830", "title": "Infinite redirect loop in cPanel-purchased script", "text": "I am installing a script (that I bought on cPanel) in the root directory of my web site. When I try to install it, it gives me an error. I found that it starts an infinite URL redirect loop containing the name of my web site. Something like: `install//mywebsite.com/install=mywebsite.com/install=mywebsite.com/install=mywebsite.com` etc. until the browser refuses to continue when URL gets too long. The vendor told me I need to have *mod_rewrite* installed on my cPanel and something about _.htaccess_. How do I do fix this?"} {"_id": "12837", "title": "what redirect do i use?", "text": "I have a hosted rails application that gives every account a sub domain eg company.hosted_site.com I have pointed our customers own domain to their subdomain via a A record and a cname so it looks like their own site. But now i have 2 versions of the same site running so i need to now redirect the subdomain to the customers own domain. My question is what redirect is best for this for SEO ? thank you Rick"} {"_id": "15288", "title": "Accepting Payments Online", "text": "What's the most cost-efficient way to accept payments on my website? American Express will be the card of choice. Average sales order ~ $100-250. 
I am only in Alpha stage so I don't want to pay Braintree (or others) $80-100 / month, right off the bat (bootstrapping! ) Is PayPal my best bet? Again, I will be receiving payments from corporate credit card holders via my website. Does anyone know of any alternatives?"} {"_id": "8019", "title": "Optimizing perceived load time for social sharing widgets on a page?", "text": "I have placed the facebook \"like\" and some other social bookmarking websites link on my blog, such as Google Buzz, Digg, Twitter, etc. I just noticed that it takes a while to load my blog page as it need to load the data from the social networking sites (such as number of likes etc). How can I place the links efficiently so that first my blog content loads, and meanwhile it loads data from these websites -- in other words, these sharing widgets should not hang my blog page while waiting for data from external sites?"} {"_id": "8018", "title": "Whois status \"pending delete\" with expiration date in November 2011?", "text": "A friend of mine is in the process of being scammed by a domain registrar and I am trying to sort out the mess. However I could use a hand understanding some of the details. He paid for 2 years of domain name registration on 6 november 2009. The whois record reads: Domain ID:XXXXXXXXXX Domain Name:XXXXXXXXX.ORG Created On:06-Nov-2009 09:23:12 UTC Last Updated On:17-Dec-2010 00:15:10 UTC Expiration Date:06-Nov-2011 09:23:12 UTC Sponsoring Registrar:OnlineNIC Inc. (R64-LROR) Status:CLIENT TRANSFER PROHIBITED Status:HOLD Status:PENDING DELETE SCHEDULED FOR RELEASE Registrant ID:ONLC-XXXXXXX-X Registrant Name:My friend's name ... Registrant Email:Old email The registrar charged a renewal fee a week ago and is now asking an extra $150 to \"reclaim\" the domain name, even though the domain name is apparently still in my friend's name and it looks like there is still another 10 months before the expiry date. 
The expiration date on the WhoIs record looks right (Nov 2011), so I don't understand why the domain status says \"PENDING DELETE SCHEDULED FOR RELEASE\". Can someone explain me better what the deal is and explain what I need to do get the domain name transfered to a more honest registrar? I already have a registrar for my own domain names, been using them for 10 years without problems, so I know where to transfer the domain names to, I just don't know how to proceed."} {"_id": "8017", "title": "web designers icon avatars?", "text": "Anybody know how these are made. I see them a lot, mostly web designers have them. Are they hand made in psd or illustrator or is there a web service that converts real photos? ![alt text](http://i.stack.imgur.com/Gu5nv.png)"} {"_id": "8016", "title": "Why Google not ranking my websites..?", "text": "Please observe http://www.panbeli.in and http://www.softwaregenius.net Why is this site does not get ranked by Google Spider? It has been grayed in Google Tool Bar for a while. Google has the knowledge of it. You can search for http://www.panbeli.in and http://www.softwaregenius.net and you will see the both websites in google search,yahoo search,bing search, you found that we have lots of backlinks also ,http://www.panbeli.in alexa rank is 695,295 and http://www.softwaregenius.net alexa rank is 994,219. we have also created Blogs/News/Classified/google ADsence/press relese etc to increase the rang for http://www.panbeli.in and http://www.softwaregenius.net websites. When i check my SEO statistics using different SEO tools, it shows Google backlink is zero. however, Google just does not want to rank it. Can you tell me why? Why is it penalized? I have tried to do my best to clean it. Your kind reply is appreciated. 
Thanks and Regards, Irfan"} {"_id": "34556", "title": "E-commerce + CMS: 2 sites or one?", "text": "Ok, let's say that a customer already has a CMS-managed web site but now wants to sell goods online using an E-commerce platform (Magento in this case). My question is: does it make any difference between choosing to have just one site running both CMS and E-commerce (`www.mycompany.com`), or to have one site for the CMS (`www.mycompany.com`) and another (`www.mycompany-shop.com`) for E-commerce? I'd like to know the pros and cons of these approaches, so that I can best advise the customer. --EDIT I forgot to say that I'd prefer to have 2 separate web sites. This way I wouldn't have to learn how to integrate them together (one in Python, the other in PHP)."} {"_id": "8014", "title": "Google map in MediaWiki not showing", "text": "I have upgraded MediaWiki from 1.9.3 to 1.16.1 on a new server. However, the Google map is not showing at the link below. The page is blank there, but on the old server with the old version it is working fine. I am not a developer, so I have no clue about this. Please let me know if anybody has any idea about this. You can have a look at the link below: http://new.realchicago.org/wiki/index.php/Archer_Heights The Google map is missing at that link."} {"_id": "8011", "title": "Payment Gateway options other than Paypal, for sending out mass payments", "text": "We were using PayPal Payments Pro earlier for the same thing, but for some reason PayPal has been given some new guidelines which kind of hinder the way we need to send out payments at the moment. We receive payments from clients and then send out payments back to vendors on a weekly basis (deducting our cut). Can you let me know what options are available for such transactions other than PayPal? Which is the best in terms of setup cost, etc.? Thanks"} {"_id": "47676", "title": "SEO and traffic flow for multi-domain web app", "text": "I'm working with a small team on a web app.
**Background** For many years we have run a informational website. It gets lots of traffic (1M+ per month) and is near the top on the first page of the search engines for our keywords. This website is basically an informational database with text and illustrations. It has been curated by our \"principal\" for this time. He is an expert in our field, but has absolutely no technical expertise. Seriously... he built the whole site using static HTML pages and now has over 1000 .html files to maintain. I didn't get involved with this project until it was too late. Anyways, let's call this site info.com. **Issue** We are now creating an interactive web application that utilizes the information from the other site. We'll call this \"app.com\". Our \"principal\" has created the specifications for the app, and myself and one other developer are doing the work. In his specifications, he does not want to pull info from info.com in a script, rather, he wants users of app.com to click links to info.com as he believes it will boost the hits on info.com which ultimately lead to sales. However, as a team we are discussing ideas for how to mitigate the \"jumping around\" between sites. I suggested that when a user \"links\" back to info.com, we display content from info.com within an iframe on app.com. I liked this idea because in essence the user will see our \"app.com\" toolbar (top-page fixed, facebook style) overlayed on \"info.com\", which users of \"app.com\" will be very familiar with. The \"principal\" didn't like this idea because he believed doing this would NOT increase the pagerank of \"info.com\" and would in fact hurt it if more people access the \"info.com\" content via the iframe on \"app.com\". I seriously know nothing about SEO but my gut tells me this is not factual. TL;DR: * How does traffic volume affect SEO? 
* Will putting content from \"info.com\" in an iframe on \"app.com\" hurt the ranking of \"info.com\" if users of \"info.com\" start accessing the info via \"app.com\"?"} {"_id": "7456", "title": "Service and/or tool to monitor performance?", "text": "I am seeing wildly different performance from a client's web site, and would like to set up some sort of monitoring. What I'm looking for is a service that will issue requests to a couple of URLs, and report on the time it took to process the page - TTFB and time to download the entire page - that means I need something that will process JavaScript & CSS. Are there services like this? I've seen a few that monitor uptime, but they don't seem to report on the overall page performance."} {"_id": "7457", "title": "Is it necessary to use Timthumb in Wordpress 2.9+?", "text": "As WordPress 2.9 comes with built-in post thumbnail features, is it necessary to use Timthumb?"} {"_id": "52434", "title": "How do I point my domain to my website/IP address", "text": "I have recently set up a website; I've forwarded it to my IP and I'm able to access it. However, I have also bought a domain which I would like to use for my website, which I host through my own PC using XAMPP. From what I understand I would need to create a DNS server, but I still do not understand how to do it. So my question is: how do I point my newly bought domain to my website, which is being hosted using XAMPP? Thanks"} {"_id": "44930", "title": "Don't $_GET value on link rewrite with .htaccess", "text": "I enabled .htaccess on my server (I enabled mod_rewrite and followed this: http://stackoverflow.com/questions/11064005/enable-htacess-error-on-ubuntu-12-04-with-apache2). On localhost the PHP `echo $_GET` value works: http://oseeyou.com/test/on-localhost.png but on the server it doesn't: http://oseeyou.com/test/on-server.png I don't understand this; can you help me?
Thanks."} {"_id": "7450", "title": "Domains with similar names and issues", "text": "I recently purchased one of those domain names like del.icio.us. While registering I found that delicious.com was being used. **Argument:** I found that delicious.com belonged to the same category as my to-be website. It served premium delicious dishes. **Counter Argument:** My to-be domain, though belonging to the same category, specialized in serving free but delicious dishes or in giving out links (affiliate) to other sites serving premium delicious dishes. **Additional Counter Arguments:** 1. delicious.com was not in English. 2. The del.icio.us in my domain name, though having the same spelling, is not going to be used in the same fashion. For example (this may not make sense, because the names have been changed) the d in delicious on my website actually stands for the Greek letter Delta (\u0394/\u03b4), and since internationalized domains are still not easily typable, I am going for the English equivalent. The prefix holds importance for the theme of the service which my website intends to offer. **My Question:** Can I use the domain name del.icio.us for my website? How are these kinds of matters dealt with? (The domain names used are fictitious. And I have already registered the domain but have not started using it. I chanced upon this domain name because it was short, easy to remember and suited the theme of my website.) Update: Can I use the domain name for an unrelated service? I have no wish to trample their rights. Update: Are there any real-life examples of similar name use in different fields without the companies fighting it out? The domain name is pretty generic."} {"_id": "18881", "title": "How to recover a website from google cache?", "text": "My host provider told me last night their server crashed and they had restored the website from an image they had from months ago. They told me this is the only backup they have.
I also have no backup, as my backup server crashed few days ago. Through google cache I can see all the website as it was few days ago. Is it possible to recover my website using google cache or is there any tool which help me to recover? Thanks,"} {"_id": "30218", "title": "Retrieving website", "text": "> **Possible Duplicate:** > How to recover a website from google cache? Does anyone know of reliable websites (Warrick is currently closed for new requests) or programs (Mac-compatible) where I can retrieve a website that was deleted? I wasn't informed by my blog's host that my website was deleted because hosting services was terminated by the company. Since I have 2 years worth of posts at the website, I was wondering if it would be possible to retrieve them. I was using Wordpress.org for my website."} {"_id": "26629", "title": "Use - or misuse - of ETags", "text": "I'm using an application that sets ETags by md5()ing the URL. As I understand it, that's quite insane. In effect, it means that content for a specific URL won't be fetched anew ... ever. Unless a hard-refresh is sent or, maybe, the browser is restarted (yet to test the latter). Is this abuse of the ETag header?"} {"_id": "2035", "title": "CA For A Large Intranet", "text": "I'm managing what has become a very large intranet (over 100 different hosts / services) and will be stepping down from my role in the near future. I want to make things easy for the next ~~victim~~ person who takes my place. All hosts are secured via SSL. This includes various portals, wikis, data entry systems, HR systems and other sensitive things. We're using self signed certificates which worked o.k. in the past, but are now problematic because: * Browsers make it harder for users to understand exactly what is going on when a self signed certificate is encountered, much less accept them. 
* Putting up a new host means 100 phone calls asking what \"Add an exception\" means What we were doing is just importing the self signed certs when we set up a new workstation. This was fine when we only had a dozen to deal with, but now its just overwhelming. Our I.T. Department has classified this as `ya all's problem`, all we get from them is support for switch and router configurations. Beyond the user having connectivity, everything else is up to the intranet administrators. We have a mix of Ubuntu and Windows workstations. We'd like to set up our own self signed CA root, which can sign certificates for each host that we deploy on the intranet. Client browsers would of course be told to trust our CA. My question is, would this be dangerous and would we be better off going with intermediate certificates from someone like Verisign? Either way, I still have to import the root for the intermediate CA, so I really don't see what the difference is? Other than charging us money, what would Verisign be doing that we could not, beyond protecting the root CA cert so it can't be used to make forgeries?"} {"_id": "1377", "title": "How bad is it to use display: none in CSS?", "text": "I've heard many times that it's bad to use `display: none` for SEO reasons, as it could be an attempt to push in irrelevant popular keywords. A few questions: 1. Is that still received wisdom? 2. Does it make a difference if you're only hiding a single word, or perhaps a single character? 3. If you should avoid any use of it, what are the preferred techniques for hiding (in situations where you need it to become visible again on certain conditions)? Some references I've found so far: Matt Cutts from 2005 in a comment > If you're straight-out using CSS to hide text, don't be surprised if that is > called spam. 
I'm not saying that mouseovers or DHTML text or have-a-logo- > but-also-have-text is spam; I answered that last one at a conference when I > said \"imagine how it would look to a visitor, a competitor, or someone > checking out a spam report. If you show your company's name and it's Expo > Markers instead of an Expo Markers logo, you should be fine. If the text you > decide to show is 'Expo Markers cheap online discount buy online Expo > Markers sale ...' then I would be more cautious, because that can look bad.\" And in another comment on the same article > We can flag text that appears to be hidden using CSS at Google. **To date** > we have not algorithmically removed sites for doing that. We try hard to > avoid throwing babies out with bathwater. (My emphasis) Eric Enge said in 2008 > The legitimate use of this technique is so prevalent that I would rarely > expect search engines to penalize a site for using the `display: none` > attribute. It\u2019s just very difficult to implement an algorithm that could > truly ferret out whether the particular use of `display: none` is meant to > deceive the search engines or not."} {"_id": "65065", "title": "Does Visibility:Hiden and Display:None Affect SEO", "text": "If I had some text that initially had the CSS property visiblity:hidden or display:none Would that effect how much the content is weighted by search engines? Could using one of those properties on key content on my website have an adverse effect on SEO?"} {"_id": "43926", "title": "Will Google Crawl Display:None Dropdown?", "text": "I have a navigation set up like so:
  • Dropdown Link
  • Dropdown Link
All the dropdowns are initially hidden and then they are shown based on `data-dropdown` when hovering on the navigation. It works perfectly, but I'm really concerned about whether Google will crawl the links if they are hidden on page load. What's the best way to approach this?"} {"_id": "64649", "title": "Will I get penalised by google for overflow hidden?", "text": "I have 2 divs next to each other, div A and div B. Div A displays a whole WordPress post and div B displays about 15 excerpts of posts with the same tag. To get this looking nice I set the height of div B to be the same as the height of div A with JavaScript. This means a lot of post excerpts will be hidden due to overflow hidden on div B. Will Google penalise me for this, and if so, how can I make this work and still please Google? Ps. I would gladly prevent the excerpts from being indexed but the main post must be indexed Ds."} {"_id": "17666", "title": "Using Accordion, jQuery, will it affect SEO?", "text": "> **Possible Duplicate:** > How bad is it to use display: none in CSS? I am planning to use Accordion and jQuery in my website. Previously, to animate the contents in the website I used Flash, but content (text) in Flash is not indexable by search engines. Since the major contents of my site come within an animated tab using Accordion and jQuery, and most of the text is hidden in a DIV, I want to know whether this will affect the SEO. Because most of the contents are hidden initially by default, and by the definition of black hat SEO this sounds similar to black hat SEO... will Google treat my site the same as black hat SEO, or is using Accordion in a website acceptable for SEO?"} {"_id": "45356", "title": "Preloading web fonts with visibility: hidden; bad for SEO?", "text": "I've started using a method to preload the webfonts via hiding them; whilst it works great, I'm not sure if \"hiding text\" will be viewed as bad for SEO?
All I do is something like this: CSS; .preload { position: absolute; visibility: hidden; font-family: Roboto, Arial; font-weight: 300; } HTML:
<div class=\"preload\">Preload Light Font</div>
Thoughts!?"} {"_id": "53616", "title": "Bootstrap hidden text and page rank", "text": "I am using Twitter bootstap to show different text when on mobile mode, tablet mode and desktop mode. I was worried that since the (some) text is hidden on certain modes, would it effect my website page rank."} {"_id": "13408", "title": "DIsplay none and SEO", "text": "> **Possible Duplicate:** > How bad is it to use display: none in CSS? So i have a nav with tons of stuff and we want to scale back a few of the a tags in the nav but still have them on the page. So for example I was thinking of using jQuery or css display none to hide the text...but we still want the text on the page for the google crawler. Does this technique hurt our current SEO rankings or is it frowned on by google. Here is my site and the nav is on the left....If this will hurt the SEO status is there anything else we can do to keep the text on the page without effecting SEO"} {"_id": "33577", "title": "Official Google 10 minutes SEO video and page related terms", "text": "> **Possible Duplicate:** > How bad is it to use display: none in CSS? According to this link, one should add some terms related to the page content. I am adding the text like: `
<div style=\"display:none\">term1 term2 term3 term4 term5</div>
` **My questions:** Does it affect the SEO for my website if I am not displaying the terms on page? (I am using `display:none`). In other words, the page related terms are in source code but not being displayed."} {"_id": "68241", "title": "Make Offscreen Sliding Content Without Hurting SEO", "text": "On my website I have content which is positioned off the screen, and then slides in when you click a button. For example, when you click the news button, content slides in with news. It didn't occur to me that this might be labeled as a black hat SEO technique, because I have content positioned off the screen with CSS that links elsewhere on my site, and a search engine could very easily interpret that as me hiding content for SEO purposes by positioning it off screen. Obviously, my intention was not to hide content, but was to make a sort of UI/UX content slider where content slides into view when a button is clicked. How can I make something to this effect (where content slides in and out), that would not comprise SEO?"} {"_id": "49139", "title": "One Page Website, with jump links, but hide, display:none all other content except jump link clicked", "text": "I'm wondering how this website design/structure will work for SEO. I have example.com that I'm building. The only actual URL for the website is example.com. However, it will appear to the user that their is 8 different pages. For example. There will be example.com, example.com/#about, example.com/#contact, example.com/#services, etc. I'm using jquery to hide all the other 7 \"pages\" and only show the link that is clicked. So if a user goes to example.com, it will hide all the 7 other \"pages\" and only show the example.com. If the user then clicks on the link example.com/#about, it will hide example.com content, and keep all the other pages content hidden, but then only show example.com/#about content. 
So everything on the website stays the same, the only thing that changes is the \"content\" div when they click on a menu link. I notice that this is VERY good for user experience in that it loads the whole website code when they first come to the website, so when they click between \"pages\" it is instant. I'm interested in getting answers to why this is a bad idea for like SEO? or any other reasons? Anyone else ever use this kind of setup? Thanks."} {"_id": "68245", "title": "Does CSS Positioning Affect SEO", "text": "If I positioned the very first content that appears in my code below the fold, would that content be given less weight and therefore be less effective with SEO? In addition, if I had a large image that took up most of the top of the screen and resulting in my content being below the fold or toward the bottom of the screen, would that content be given less weight? **Note** This is content that occurs early on in my code. I'm not talking about having a ton of content and if the content that occurs later would be given less weight, but if content that occurs early on put ends up below the fold would be given less weight."} {"_id": "18385", "title": "Can search engines see HTML elements hidden by CSS?", "text": "> **Possible Duplicate:** > How bad is it to use display: none in CSS? I was wondering if search engines can read `` elements with `display:none`. Is it true? Can search engines see elements hidden by CSS rules?"} {"_id": "56920", "title": "Showing other name than website name after searching on google", "text": "When I search the same URL on Google, its showing, 'Total Visits' as a header. It not showing my website name. Please check the image. What should I do ? ![enter image description here](http://i.stack.imgur.com/UbiiU.jpg)"} {"_id": "53960", "title": "Why is my website's title not showing in search results?", "text": "When I search for my website, the title of my website appears as \"New York web design\". 
![Screenshot of Google search result](http://i.stack.imgur.com/yieAu.jpg) But my website's actual title is different: \"Web Development New York | New York web design | SEO New York\". How can I rectify this issue?"} {"_id": "57280", "title": "Google search mismatch title", "text": "The title of my website is \"OTN | Ontario Telemedicine Network\", but Googling for \"OTN\" displays it as \"Ontario Telemedicine Network: OTN\" in the search results; other search engines display it properly. How can I direct Google to display the title as it is? P.S. The title was updated many months ago, and I have checked Google Webmaster Tools for this option but couldn't find anything."} {"_id": "56524", "title": "Google Adds Unwanted Name to Site Title!", "text": "I've set a \"Site title\" that is different from the site domain, for example: Site domain: \"setcars.com\" Site title: \"Hello world - new cars!\" Internal links to the index page (in menus, footer, etc.) use the name \"SetCars\". \"SetCars\" is not used in a TITLE tag anywhere, but Google adds it to titles on SERPs: \"Hello world - new cars! - SetCars\" Why? And how can I remove it?"} {"_id": "53065", "title": "Google shows wrong title", "text": "A few weeks ago I redirected my site using a 301 redirect. I went to Google Webmaster Tools and set it up. I was using Fetch as Google, but it still shows the wrong title. I really don't know why. Everything is fine on my website. The last cache was 2 days ago. My previous site had a similar title to the current one, but Google has indexed my new website a few times already and still shows the old title."} {"_id": "42660", "title": "Implementing inline editing for a web order form without using JavaScript", "text": "As the title states, I'm trying to find a good solution to implement inline editing for a web order form with just HTML/CSS. The only mediocre solution I have right now is this -> http://jsfiddle.net/jmggp/1/ But I think it looks unprofessional. 
Are there any other solutions that I can try to learn about?"} {"_id": "35931", "title": "How do Google ads refer to dynamic content?", "text": "I have a mobile app/site and I want to use Google ads with the PPM (pay per impression) method. I am changing the page content programmatically with JavaScript, so if the user navigates to a new page I actually just change the content in the same page. 1. Is there any problem with this according to Google? 2. Do I get paid for this content change as if the page had really navigated? Thanks."} {"_id": "16723", "title": "Is it feasible to run a public-facing website from my home network?", "text": "I have to get a very simple data entry website up and running. This website will be accessed by only a few people at a time (at most), but usually only one person, and not very frequently - maybe a couple hours at a time, not every day. I really don't want to pay for hosting if I don't have to. I have AT&T U-Verse Internet Service. How feasible is it for me to set up a simple Windows/IIS website on my home network that can be accessed publicly? What's the minimum Internet speed I would need? How can I make sure that the IP address for the public website is static, so I can set up my DNS and forget about it? Any additional thoughts or suggestions on this type of setup would be greatly appreciated. If this is not feasible, or easy, I will likely just go with a hosting service, and I already have one picked out, so I don't need any suggestions for this."} {"_id": "35935", "title": "How to specify importance of html elements?", "text": "Is it possible to specify which elements of the page are important, or, more specifically, which elements of the page **are not important**? I'm using the new HTML5 elements (nav, header, footer, section, article, aside...), but my login form (in the header of my page though) sometimes appears in the Google description of my website's pages... 
Is there a solution to resolve this problem?"} {"_id": "28386", "title": "Is it better in terms of SEO if I develop my website meant for marketing my service on a regular .com domain compared to a WordPress site?", "text": "Basically I am asking whether Google et al are going to _treat_ my webpages developed through WordPress in the same way as they would treat other websites with a *.com. Also, as far as SEO goes, do I have to jump through some additional hoops if I am working with a WordPress site? Would I have any advantages instead?"} {"_id": "3131", "title": "Is it better to have an ErrorDocument 404 redirect back to the homepage or a standard 404 error page?", "text": "We run an ecommerce site that was set up by a third-party ecommerce software provider: a basic shop with product pages, basket and checkout. The third-party vendor set up the htaccess file so that if a non-existent URL is entered it redirects to the homepage instead of a 404 error being generated. This results in Google Webmaster Tools reporting duplicate titles and descriptions for pages that no longer exist. I think that it would be better for a dedicated 404 error page to be displayed rather than redirecting back to the homepage. So is it better for a website to redirect a 404 to the homepage, or to have a dedicated 404 page instead?"} {"_id": "50414", "title": "What to use for rapid PageSpeed analysis \u2014 Google Insights or Chrome extension?", "text": "Which score is most reliable (and complete) to use when comparing different sites and pages to get a quick idea (e.g. between competitor websites without detailed analysis)? I am asking because for the same page on the same site, the score is over 30 marks higher via Google Insights online, which leads me to believe that the official Chrome extension is more accurate (e.g. worse score). 
Some of this has been addressed in detail before, but for the purposes of doing quick analysis \u2014 what accounts for the different PageSpeed scores?"} {"_id": "65080", "title": "Should you let Googlebot crawl and index your entire site before updating it again?", "text": "I have heard from various people that \"Google has scanned x% of the site\". Some have told me at times not to upload anything, nor make any changes to the site, \"since Google hasn't finished scanning the site\". I asked one of them what he meant by that, and he told me that if you go to Webmaster Tools and look at the crawl statistics, there is a graph that shows pages crawled per day. Also, by the third or fourth time a page is scanned, Google's index is updated. He even told me that it is better to create a backlink from a site (that will interview me) in about a month, not now, since Google hasn't finished scanning the site after some big changes in the URL structure, so the backlink would go unnoticed. Is there any logic in the above? In which ways can you use this graph and what assumptions can you derive? Should someone ever stop doing anything \"because Google is scanning\"?"} {"_id": "27370", "title": "Does including Google Analytics on a page make you run the risk of failing a PCI audit?", "text": "My webshop is currently in the process of becoming PCI compliant. For business intelligence reasons we would like to include Google Analytics code (or code from a similar package) on the checkout page of our payment funnel. As this involves including third-party JavaScript code on a page which handles customers' credit card details, I am wondering if this type of integration runs the risk of not passing a PCI audit."} {"_id": "65083", "title": "Google Adwords - how to specify AND and OR in Keyword", "text": "There are many ways to define keywords for a campaign. However, is there a way to make ORs? When searching in Google it would look like '\"keyword (keywordOr1|keywordOr2)\"'. 
That would return results such as \"keyword keywordOr1\" OR \"keyword keywordOr2\" but not keyword alone."} {"_id": "13602", "title": "Can we trademark the name of our group, with which the founder and president is no longer associated?", "text": "I am looking for advice. We have a social group. Our president chose to leave our group (I do not wish to name names). This president founded the group and chose the name. The group pays dues and has annual parties of which the proceeds go back to the group for other functions. The president stole the money. When we confronted the president, they quit the group. I have checked and the name is not trademarked. We want to trademark the name to prevent the former president from having a claim to the money under the group name. Can we do this legally?"} {"_id": "55331", "title": "Tell Google the list of URLs to crawl", "text": "Consider the case of Quora, which is really SEO-friendly. The problem is that there are **no** links to questions from the Quora homepage. Essentially nothing (even via an indirect URL). So I could never find any link to Quora questions from its homepage; I could only do so via Google (or other search engines), or by having a Quora account. So my question is how to tell Google the list of pages to crawl without listing them all on the homepage (or via indirect links)?"} {"_id": "55330", "title": "Will SEO work on semi-one-page website?", "text": "I am setting up a cafe. I read a lot about SEO not working well with one-page websites. As such I am thinking of creating an in-between with the content structure listed below. Will this be effective for SEO? 
* Home Page (one page) - keywords : Cat Caf\u00e9 Singapore, Singapore Cat Caf\u00e9, Cat Caf\u00e9, * About Us (one page) * Meet the Cats (one page) * Price (one page) * Contact Us (one page) * Menu (stand alone) \u2013 keywords : Coffee Singapore, Tea Singapore, Caf\u00e9 Singapore * Research (stand alone) \u2013 keywords : Cats Singapore, Cats * Blog (stand alone) \u2013 keywords : Cat Caf\u00e9 Singapore, Singapore Cat Caf\u00e9, Cat Caf\u00e9, * Shop (stand alone) \u2013 keywords : Cat Toys, Cat Food * Adopt a cat (stand alone) \u2013 keywords : Cat Adoption, Singapore Cats,"} {"_id": "2746", "title": "Variable IPs: How variable are they? Best practices for tracking", "text": "There are a lot of questions on StackOverflow relating to session security / session hijacking, but there doesn't seem to be a really good solution to the problem. The three most common suggestions are as follows: 1. Track the user's IP address as part of their $_SESSION data, and possibly invalidate a session if it changes. The downside is that lots of users have dynamic IP addresses, so you risk invalidating a user seemingly at random (from their perspective). 2. Same as 1., but using a User Agent. Two issues here: there may not be a UA to track, and they can change during browser upgrades, etc. 3. A second cookie, with a unique token. The problem here is that if an attacker gets hold of the normal session cookie, they're very likely to be able to get hold of your secondary token as well. So, with these three options it seems that IP address is the best option, since you're guaranteed to be passed one and it's independent of physical security (and if the user is physically compromised, you lose regardless). With that in mind, I have a couple of questions relating to IP address changes: 1. How often would a user's IP address really change under normal conditions? I have DSL at home, with the usual dynamic IP concerns, and according to Gmail my IP hasn't changed in days. 
AFAIK, this only really happens when the modem cycles anyway, right? That seems like a rare enough event that it _might_ be OK to invalidate the session. 2. I think I remember Jeff saying in one of the SO podcasts that they did something similar, though it was possibly for something else. The idea was that using the first two (I believe) octets of an IP address could be considered \"close enough\" in some circumstances. This allows a user to move around on the same ISP, but the system would notice if the user was suddenly in another ISP's range. Is this a viable tactic?"} {"_id": "63493", "title": "Google not recognizing microdata?", "text": "I added microdata to one page of a site I help manage, using schema.org. Using the Google webmaster tool test, the page checks out and displays what it sees as the microdata properly. But when I go to the Structured Data page in Webmaster Tools, it keeps saying the site does not have any. I put it in 2 weeks ago. Is it just something that takes a while to be recognized? Or does microdata have to be on every page for it to be recognized or something?"} {"_id": "55624", "title": "How to interpret Google policy on crawling and indexing search results pages?", "text": "I just learned that we can be penalized for letting Google crawl and index our database search results. http://webmasters.stackexchange.com/a/55599/33777 **Question:** If they don't want to list Yellow Page type results, then why do they? These sites have been around forever, and the domains haven't changed. I just did a search for specific keywords on a friend's website. He ranks #7. The first six are all search results pages for well-known Yellow Page type sites. I'm a back-end developer, so this is all new to me. I reviewed the Webmaster Guidelines and saw this (emphasis mine): > Use robots.txt to prevent crawling of search results pages or other auto-generated pages **that don't add much value for users coming from search engines**. That is very subjective. 
* My client has a page listing the names of all companies in our database. * Each company name is a link to the search result page for that exact phrase. * That result page in turn links to their company profiles in the different publications. * The same company might have a profile in more than one publication. * The profiles may be similar, but would list different product categories depending on the publication. This was originally set up because the client was trying to compete with Yellow Pages and the like. And we are ranking pretty well when people search for particular companies. But I don't want to get penalized. We were linking to the search results rather than directly to the profile because one company might have multiple profiles. However, the client now wants to segregate the different publications more. So, I can save a click for users if the company list links directly to the profile and skips the search results page. **Question:** Is the list of company names and the profile for each company acceptable content for Google? Do you think we will be penalized for letting them crawl and index that? We were just about to add a similar list of all categories in the database where each category would link to a list of the companies in that category. I think this has _value for users coming from search engines_. But it's subjective. **Question:** Since it is dynamically generated, Google could randomly request words like Viagra and we would present a \"No matches found.\" page. That page would have a _noindex_ meta tag. But is this enough to get us penalized? **Note:** The actual search form uses POST which I believe Google avoids. We would only generate links for exact category names that exist in the database. So we wouldn't be inviting them to crawl a search results page, but more a directory landing page. However, nothing is there to stop Google from hunting for content by manipulating the URLs. 
**Vent:** I know some basic SEO, and I've always gone with the idea of thinking about our users first - provide them with the content they are looking for - and let Googlebot figure things out on its own. It seems counter-intuitive to me to have to tell Google to stop crawling my site. The same thing goes for nofollow on partnership links (which I also just learned about). IMO, Google should just figure out what is relevant/valuable and display that. They shouldn't penalize sites for having content that doesn't interest them. **Aside:** If they don't want to crawl useless pages, why are they still requesting pages that have been sending 301, 404, or 410 for more than a year? And no, there are no inbound links to these pages."} {"_id": "25771", "title": "What forum software should I use?", "text": "_This is a general, community wiki catch-all question to address \"I need a forum script that does x, y, and z...\" questions._ _If your question was closed as a duplicate of this question and you feel that the information provided here does not provide a sufficient answer, please open a discussion on the Pro Webmasters Meta._ * * * I have a list of features that I want for my website's forum script: where can I find a (free or paid) script that includes all of them?"} {"_id": "26804", "title": "Forums/CMS/BBS that actually has a bulletin board look", "text": "> **Possible Duplicate:** > What Forum Software should I use? Hi, I want to build a bulletin board system for my community which can create a view that actually looks like a \"real-world\" bulletin board or cork-board. So in addition to the traditional forum view, which has a hierarchy of topics in a full-width screen, a page custom to the user could be presented where selected posts could be laid out. The posts might look like sticky notes or boxes, laid out around the page, potentially with images and text, rather than being full-screen-width text entries. 
I'd also like the system to be able to mail a version or screenshot of the custom user page to the user on a weekly basis. Does anybody know of any, highly preferably open-source, solutions that come with a feature like this?"} {"_id": "9612", "title": "Good message board for a website (e.g. phpBB)", "text": "> **Possible Duplicate:** > What Forum Software should I use? What are the best (and most widely used) Linux-based **message board** packages, and the **_pros_** and **_cons_** of each? E.g. security vulnerabilities, performance on a cheap server, comes pre-packaged [RPM or DEB]. I am looking for _the best_ message board software for my website. A VPS can run almost any software, so the sky is the limit! 1. _Free, doesn't require an unreasonable number of hyperlinks to their website_ 2. Security focused / widely used, vulnerabilities are found and fixed quickly 3. Easy to keep up-to-date, i.e. prepackaged / auto-update in some way 4. Moderator features [like pinning / message preamble], account management 5. Themeable, customize appearance a bit * * * The contenders appear to be * **phpBB** - Undeniably popular, modular. * **MyBB** - Used to be commercial, great features. LGPL"} {"_id": "2097", "title": "Online forum software or services", "text": "> **Possible Duplicate:** > What Forum Software should I use? What forum software is out there? I've used phpBB in the past, but that was a long time ago and I'm guessing there might be better stuff out there. My requirements are: * registration required to post (name, email, password) * public viewing * some ability to customize * needs to look reasonably good (it's not for geeks) * hosted or self-hosted * if self-hosted it needs to be PHP & MySQL * free or paid (I can't add tags, but I wanted to tag this \"forums\".)
Either PHP/MySQL or ASPX/SQL. I checked out a similar thread, but there weren't any real answers... Thanks"} {"_id": "14425", "title": "Bulletin Board System with tagging, email notification", "text": "I am looking for a nice BBS (Bulletin Board System), discussion board, or in-company communication platform. There are about 30 people joining our project. We would like to share ideas among ourselves on that platform. We can post questions and concerns related to the project, and we would like to respond to each other. Here is my list of functionality I want: * Tagging threads, e.g. Announcement, Finance, Legal, Idea. One thread can have multiple Tags. * Members can turn on/off receiving email when new comments are posted. They can set this on/off per Tag, e.g. one member is on for email related to \"Announcement\", but off for \"Finance\". * A thread's owner can change the thread's tags at any time. * Threads can have several types of posts. * A thread can be a \"vote\" thread. Everyone can vote their opinion. * A thread can be an \"action plan\" thread. In this thread, \"who\" will do \"what\" is recorded in the thread. By viewing all \"action plan\" threads, all action plans needed in the company are visualized."} {"_id": "5321", "title": "Looking for a simple forum script", "text": "> **Possible Duplicate:** > What Forum Software should I use? I have a working website and I need to integrate a forum into it. The problem is, I need the forum script to use usernames and passwords from a MySQL table that is already created. This is why I need a forum that supports translations and has simple code I can adjust to my database. Any ideas on such forum scripts?"} {"_id": "6983", "title": "Suggestions of internet forum software", "text": "> **Possible Duplicate:** > What Forum Software should I use? I'm going to be setting up a few internet forums to test out the software. I was going to be doing this with vBulletin, but they don't let you download a trial version. 
Could anyone make any suggestions of good forum packages and their advantages and disadvantages? (And maybe the pros and cons of forum software in general.) The forums will be hosted on an Ubuntu Linux virtual machine using Apache. **EDIT:** I should also point out that one with better security would be desirable - although if I'm honest my expectations on that front are low :P"} {"_id": "53610", "title": "Forum software with good threaded-view support?", "text": "I'm looking for good forum software that allows a threaded view (similar to reddit or Disqus), at least up to two or three \"levels\" of replies. The phpBB people seem to be vehemently against threading at all, and I believe MyBB supports only _one_ level of replies. Is there any other good forum that supports threading? Phorum is mentioned as having support for threaded views, but there were vague hints in some discussion I came across that you had to go through the replies one by one like in the ancient days, rather than having a single expanded look. Is this true of Phorum? Other candidates I came across were FUDForum, mwForum and Drupal's forum; your views on these and other possible options would be greatly appreciated."} {"_id": "8650", "title": "What is a CMS with good user forums support?", "text": "> **Possible Duplicate:** > What Forum Software should I use? If I'm starting from scratch with any CMS I want, which one has good support (or a plugin) for discussion forums?"} {"_id": "24036", "title": "Which forum applications can integrate with Facebook?", "text": "> **Possible Duplicate:** > What Forum Software should I use? What is the best/most compatible forum software which features almost complete integration with Facebook? The main feature I ask for is Facebook Connect (users could use their Facebook accounts to register). **But** it would be even better if other Facebook features could be integrated too. 
Something like subscribing to a thread so it appears in Facebook notifications, easy sharing to Facebook, etc. I have vBulletin, Invision Power Board, and SMF in mind, but I'm open to more suggestions."} {"_id": "1965", "title": "Bulletin board software with voting capability for each post", "text": "> **Possible Duplicate:** > What Forum Software should I use? I run my system on a Linux machine. Let's say I use Apache or the Cherokee web server, for simplicity. I want to run bulletin board software for discussion purposes. However, I do not want traditional systems like vBulletin or Vanilla, as systems based on these tend towards too much bickering and clique-formation. I was inspired by the system on meta.stackoverflow in which each post can be voted up or down. This would instantly be a medicine for all the jerks, since their posts will get downvoted quickly. Moreover, honest criticisms will get upvoted. So I feel this will be a great improvement. However, the meta.stackoverflow system is not suitable for a discussion environment. I would want a traditional bulletin board system itself, but with the added capability that votes can be given up/down for each post, and for each post the number of votes is displayed. For example, let me mention that in some WordPress blogs, comments can be voted up/down. That capability is exactly what I want; it is just that I need it in bulletin board software rather than in a WordPress blog. Question: > Does there exist bulletin board software with voting capability for each post? Here I must stress that I do not care about the total reputation of a user. That is something I do not want to encourage. I have asked this question both here and at Server Fault, since it was not clear where this should be asked. 
Feel free to close it at the appropriate site; I need answers from only one place."} {"_id": "25160", "title": "Best lightweight PHP/MySQL forum with spam-protection", "text": "> **Possible Duplicate:** > What Forum Software should I use? I have already searched and checked out posts like this and this, but they aren't what I'm looking for. I'm looking for a lightweight PHP/MySQL forum with decent (good) anti-spam functionality. I have been using punBB, which is light and fast but doesn't stop billions of spammers from posting boring links, and I have tested phpBB, which I found really heavy (and it doesn't stop spammers either). I'd like the possibility to receive a mail to 'accept' new registrations, auto-disable registrations that have never posted (after some time) and so on, and a captcha at registration time (if it is easily modifiable that would be a plus); (public) blacklists and/or adaptive filters to stop spam would be nice too. Does this exist in any free/cheap forum?"} {"_id": "9997", "title": "Group \"Discussion\" software?", "text": "> **Possible Duplicate:** > What Forum Software should I use? My client wants a \"lite\" forum... not unlike these Stack Exchange sites, but even a little lighter. There's a screenshot of the discussion group she likes most, below. You can also go here to see it for yourself if you like. ![enter image description here](http://i.stack.imgur.com/NtaIS.png)"} {"_id": "8973", "title": "Free forum engine with good anti-attack mechanisms", "text": "> **Possible Duplicate:** > What Forum Software should I use? I am looking for a forum engine (for discussions) with good attack countermeasures built in. Windows (preferably) or Linux. Free (as in beer). I am thinking about registration flooding and account-blocking attacks. 
For registration, such an engine should have at least: * a captcha * blocking of multiple registrations from the same IP * separate login (for logging in) and user name (for displaying the author of posts) For logging in: * no blocking on multiple tries -- instead, after X tries, sending a token via mail, the third piece needed for the next login -- without it, logging in will be impossible (similar to an activation process) The engine should be designed with two ideas in mind: * protecting the engine against attacks * zero penalty for decent users Thank you in advance for your help and recommendations."} {"_id": "10863", "title": "Forum that integrates into CMS and has curated category pages with tagged threads", "text": "> **Possible Duplicate:** > What Forum Software should I use? I'm looking for a forum that meets these requirements: * Login using Facebook/Twitter/OpenID etc. * User profiles with a reward system * Voting/thumbs-up function * Categories and tags for sorting threads * Custom category pages with a moderated static header * Embeddable threads and categories (for example, a whole category or single thread can be integrated into WordPress) * API for users, discussions etc. I've looked at forums like Vanilla, Disqus, OSQA etc., but none seem to match the above \"hybrid criteria\". Hosted or self-hosted doesn't matter, but I'm really looking for something that can be integrated into an existing CMS to replace comments while at the same time having curated category pages and user profiles. Thanks."} {"_id": "6059", "title": "Forum software advice needed", "text": "> **Possible Duplicate:** > What Forum Software should I use? we want to migrate our site's current forum (proprietary built) to a newer, more modern (feature-rich) platform. I've been looking around at the available options and have narrowed it down to vBulletin, Vanilla or Phorum (unless you have another suggestion?). 
I hope someone here can give me some feedback on their experiences either migrating to a new forum or working deeply with one. The current forum we have has approx 2.2 million threads in it and is contained in a MySQL database. Data migration is obviously the first issue; is one of the major forum vendors better or worse in this regard? The software needs to be able to be clustered and cached to ensure availability and performance. We want it to be PHP-based and store its data in MySQL. The code needs to be open to allow us to highly customise the software, both to strip out a lot of stuff and to be able to integrate our site's features. A lot of the forums I've looked at have a lot of features that duplicate our main site, in particular member management, profiles etc. I realise we'll have to do a good bit of development in removing these and tying it all back to the main site, so we want to find a platform that makes this kind of integration as easy as possible. Finally, I guess, there is 'future proofing' the forum (as best as possible) given the above. Which platform will allow us to customise it but also allow us to keep in step with upgrades? Which forum software has the best track record for bringing new features online in a timely manner? Etc. I know it's a big question, but if anyone here has any experience in some or all of the above I'd be very grateful."} {"_id": "23004", "title": "Which forum software has a \"like/don't like button\" in the post (and other features)?", "text": "> **Possible Duplicate:** > What Forum Software should I use? I want to build a new blog (on WordPress) and I want to add a forum to the website. I don't want to use phpBB for the forum section (too complex to configure and manage) and I'm searching for a simple solution. 
The features I absolutely need: * like/don't like button on every post * polls * attachment and image upload * tag system for posts Other appreciated features: * import posts from another phpBB forum * FB Connect (for user login) * integration with WordPress * reserved section (for only a group of users) At the moment I have found these solutions: * bbPress: all the features above, but it doesn't have the \"like/don't like\" button * OSQA: I don't know if it has the polls and attachment functions Can anyone help me and suggest another solution?"} {"_id": "14570", "title": "Alternative to vBulletin", "text": "I'm currently using vBulletin along with a bunch of plugins like vBSEO, shoutbox and others for a sports discussion site. I just don't have the time to keep up with the upgrading of the vBulletin software (and then all the plugins). And I'm not crazy about vBulletin in general. Last I checked there was no good single sign-on mechanism to integrate it with other systems, and the social aspects of the site, which discussions are really all about, seem lacking. Are there good alternatives which are easier to manage (ideally hosted systems), easier to integrate and more social? I'm more than happy to pay, but this community is only about 4,000 strong... so anything above a couple hundred $$ per month will have to be pretty spectacular."} {"_id": "11467", "title": "What's the best CMS for a sports club?", "text": "I have to renew a website for a sports club. There is some self-programmed CMS in use at this time, but I need a new solution: a popular CMS that could stay up to date over the next 3 or 4 years. I took a look at Drupal 7. I'm not sure whether it's easily possible to create member profiles (they should be able to edit them themselves), a board and team overviews. It would be cool if you'd share a link to a tutorial for these needs if you have one. 
_Or..._ **Any suggestions for a good CMS with my needs?**"} {"_id": "9082", "title": "Which forum software has the most advanced community/GetSatisfaction type features?", "text": "> **Possible Duplicate:** > What Forum Software should I use? I need to assemble a **GetSatisfaction/Lithium/Jive type support forum/community**. The first is not available in the desired language and the last two are priced for the enterprise market. I did research some other options (open source or SaaS) but they all seem to be either: * kind of dead (open source options) * too focused on gathering ideas/feedback (uservoice) * strictly support without the community/voting features (zendesk) **I need an open forum (people-powered support/UGC with community/voting features) & Facebook/Google/Yahoo/MSN account sign-in.** Therefore I will have to do some of the work on my own. I want to **piece things together (plugins/mods/etc) on top of a standard forum platform to give it the features I need**. For this purpose, I want to use a **mature** product with a widespread userbase, active community and lots of plugin options. I believe most will agree that my options therefore are: * vBulletin * phpBB * SMF Here are the questions: 1. **Which one of the three above offers the easier path towards the desired goal**? 2. **Which one of the three above has the most advanced features related to the desired goal?** Of course I don't expect anyone to **know** these answers _cut and dried_. I am hoping to hear some experiences and see some examples. Also, it would be great if both those questions had the same answer, but I am not going to get my hopes up... PS: I wish I could add the tags \"phpbb\" and \"smf\" ;)"} {"_id": "28104", "title": "Which free PHP based forum is the easiest to extend or customize?", "text": "> **Possible Duplicate:** > What Forum Software should I use? I am looking to start a new forum with a traditional forum layout (like webhostingtalk, for example). 
In this space, I know phpBB and SMF are strong contenders. I do not know for sure the names of other great forum software that might exist... My most important need is that it should be easy to modify the display area, at the least, without having to dig too much into the core. Drupal excels in this area with its templating system, but the forum module doesn't look like the forum interface most people are used to... It would be a great plus if the software has alternative captchas like question-based or invisible captcha. If it doesn't, I would like to be able to code it in without much trouble (that is, the software exposes a good API)"} {"_id": "21132", "title": "PHP-based forum software for maximizing SEO?", "text": "> **Possible Duplicate:** > What Forum Software should I use? I ask because I am skeptical of phpBB. I know that vBulletin frequently comes up in web searches."} {"_id": "56571", "title": "What integrated blog comments/forums solutions exist beyond Vanilla Forums?", "text": "For the past year and a half, I've been running Vanilla Forums on my WordPress-based site. My favourite feature of Vanilla is that it integrates comments from the blog's front page into the forums, giving users the continuity of a single sign-on. The downsides are that Vanilla loads slowly, performs rather poorly on iPads (a source of irony given my site's subject matter), randomly logs users out with some frequency, and generally annoys the forum regulars. I would like to replace Vanilla with different forum software. Ideally, this new solution would: * Integrate with blog comments and * Import our 1.5 years' worth of data from Vanilla. Is there anything out there that would fit my needs?"} {"_id": "7668", "title": "Forum software alternative to phpBB3", "text": "> **Possible Duplicate:** > What Forum Software should I use? I've been using phpBB3 for quite some time now. It seems to me this forum software hasn't evolved at all in all these years. 
Installing mods is a hassle, updating it to a newer version is a real pain in the arse, and moderating is not intuitive at all. Besides, I find there's just no way to stop spam on it. Lots of web software has done a great job controlling spam, but phpBB3 still doesn't, at least not without too much complex and tedious work. Since my last attempt to update to the latest version broke it, I'm finally fed up with it, and have decided I'm not wasting a minute more maintaining such a beast. I'm looking for a free software (free as in free beer and free as in free speech) alternative. So SMF is not an alternative at the moment. The most important feature I'm looking for is that there must be a script to migrate all of the current phpBB users and posts into the new system. Out of all the alternatives out there, do any of them support these features? Which one do you recommend?"} {"_id": "68979", "title": "Wrong sign-up tracking when people use Facebook connect", "text": "I have an issue tracking sign-ups for people signing up using Facebook, Google and Twitter connect. For example, each time a user signs up using Facebook connect, the referral for this sign-up will be Facebook and not the real source of traffic. We can see that via the referral path /dialog/oauth that is specific to Facebook connect. In addition, the landing page is directly the signup thank-you page, as if the user arrived directly from FB connect and signed up. (see screenshot below) ![enter image description here](http://i.stack.imgur.com/bnirv.png) We were thinking about excluding the referral path of the social media connect (like /dialog/oauth for Facebook) but I wanted to have confirmation that it will solve the issue and not create a new one."} {"_id": "28382", "title": "How do I migrate a MyBB forum to Vanilla?", "text": "I am running MyBB 1.6 and want to convert to Vanilla forum. There are no converters that I know of. Is there a way to do it using a middleman forum that is supported by both? 
e.g. `mybb -> phpbb -> vanilla`? I am also willing to do this manually if someone can point me in the right direction and start me off; is it a matter of just changing SQL column names and such?"} {"_id": "42886", "title": "How to let Google know that content on the page has just changed?", "text": "I'd never thought of such a problem in the whole SEO spectrum before I considered launching my own website. Imagine that I have a website which is updated once per month and the Google spider knows about that. So what happens if I've updated one of my pages with new content just after it was crawled? Do I need to wait for a month before the Google spider comes back? And is it possible that someone can find my website by queries which are no longer relevant to it? Has anybody encountered such a problem or has a solution?"} {"_id": "53994", "title": "How to find out when an \"example.NAME\" domain was registered?", "text": "I am attempting to obtain the registry records of a `.name` domain - I have tried using who.is to obtain details of who has registered this particular domain, when it was created and when it will expire; however, when using this whois checker I do not get any results. How can you get the whois information for `.name` domains?"} {"_id": "13607", "title": "iPod / iPhone CSS Website Template w/ Height Width Dimensions", "text": "You can see here a (very) simplified version of my website: http://ple100.free.fr/foo And on iPhone, it looks like this: http://ple100.free.fr/foo/iphone.png As you can see, we don't see the right border of the page. And we have a black border on the left... I'd like it to be like this: http://ple100.free.fr/foo/iphone2.png Do you know how to solve this problem? Here is the CSS: http://ple100.free.fr/foo/style.css"} {"_id": "22674", "title": "Looking for PHP/MySQL-based ad manager", "text": "Could you recommend, based on your experience, a PHP/MySQL-based admin interface for managing your website ads? 
In order to be really useful, such an application should have: - **basic CRM functionality** to track who is providing the ads - **multilingual multi-country support**: the ability to specify, for the same ad, different versions for multiple languages/countries - **predefined ad formats** (Google Ads, Flash ads...) **and sizes** with corresponding PHP helpers so as to insert in the HTML code the necessary markup to properly integrate the ad. Ideally, if that application could be designed for Zend Framework that would be awesome (but I think I'm dreaming at this point)."} {"_id": "10888", "title": "Godaddy multiple domain problem", "text": "I have the GoDaddy Deluxe plan and here is my problem: I have two domains, for example e1.com and e2.com. Both are hosted on the same hosting plan. First I created a folder for each domain in the root folder and uploaded the two web sites, but when I try to run my sites, the URL for e1 always shows `http://e1.com/e1/` and for e2 it shows `http://e2.com/e2`. Can I avoid showing the e1 and e2 folders and only show `http://e1.com` and `http://e2.com`? Thank you."} {"_id": "22670", "title": "Where can I get a list of Google excluded words, like to, is, am, in, of, the?", "text": "Where can I get a list of Google's excluded words (like to, is, am, in, of, the) that are ignored when a search is performed in Google? I hope you have understood what I want: the list of words Google excludes when a user searches in Google."} {"_id": "33305", "title": "link rel=\"alternate\", multiple languages and canonical urls", "text": "Context: We've got a website which is available in multiple language versions. However, the content is the same. Each translated version of the site is available on a distinct subdomain. On some pages, we use `` to point to the canonical version of the same page in the same language if necessary. This is just basic normalization stuff, in this case ordering of tags. 
The problem is, we could have a page where a `` is present for normalization and additionally a bunch of `` tags to point to different translations. What does Google do in such a case? We do not want to be punished for duplicate content, but we do not want to lose the different language versions of the page in the Google index. We've searched a lot and couldn't find anything which addresses our case, and some resources were confusing on this topic."} {"_id": "30385", "title": "How can I get Google to remove links to my site from their index and then recrawl my site?", "text": "I have put my web site up on the internet and Google has pretty much completely indexed it. I have changed my URL structure so now all the pages Google has indexed are 404. Is there a way I can get Google to delete all these pages and then"} {"_id": "60900", "title": "Iframe displays old file", "text": "I am using MVC 3. I open a new tab on button click, and in the new tab I open a view which has an iframe that loads a PDF file. This works fine, but when I open it again it displays the old file instead of the new file. The file name is the same but the file content has changed. **Updated** If the PDF file name is generated randomly then it works fine, but I don't want to do this."} {"_id": "65793", "title": "What is the best practice for pagination to avoid duplicated content?", "text": "The SEO department of the company I work for requested that the user comments on all pages should be paginated and, by this, generate a new URL, for example: **Original page:** foo.com/bar.html (load the first 10 comments) **Paginated page:** foo.com/bar.html?page=1 (offset the 10 previous comments, and show 10 more comments - and so on) And when the user is on a paged URL, the canonical tag must point to the child URL, and not to the main URL. I honestly think this is not good, for these reasons: 1. Comments made by users are, most of the time, useless content. 
And although I can't prove it, I'm pretty sure Google can detect what is your article and what are the comments on the page. So I think comments need to be loaded via JS, and not printed in the HTML. (It would also reduce the page weight); 2. I think the canonical should point to the main URL - and not the child URL - since it's the very same content on every child page; the only difference is that it will load new comments, but the main content is the same. 3. I would also add a NOINDEX, FOLLOW tag to every child page in order to make sure Google does not index any of my child pages and always prioritizes the main URL. What are your thoughts about it?"} {"_id": "60906", "title": "When does Google show images in search results", "text": "When making a search on Google for something rather generic, Google might show image results above the list of organic search result websites, or in a top position. There would also be a link like \"More images for [search terms]\". Are there any generally accepted rules regarding when Google shows images? What kind of searches does Google consider relevant for image results?"} {"_id": "50672", "title": "Is it beneficial to use more than one website analytics application for a single website?", "text": "For example, mixing: Woopra, Google Analytics, and Clicky. They all offer free services, and this way I could have more information. Are there any problems with this (e.g., slowing down the website, conflicts between the tools, etc...)?"} {"_id": "26568", "title": "Personal Web Page", "text": "I recently got a domain and am considering hosting a personal web page. I did some research and narrowed my options down to either `Google AppEngine` or `GitHub` pages to host my website. The website is just a simple personal data website with `About`, `Contact`, `Interests`, etc. Don't really have any plans for dynamic content but who knows... 
Anyways, recently I have seen a lot of one/two-page websites that follow a similar template / design (particularly in mobile software and geeky personal pages; a rather simple example). I was wondering if there is a trend that I am not aware of. I assume there must be some popular frameworks being used, or is it all just HTML5/CSS templates? Also, in terms of web page design, the selected answer on this question: Beginning a Personal Web Site recommended a CMS. In contrast, this question Recommendations for a good personal/resume website recommends going for boilerplate HTML5 templates. Although I'm leaning towards HTML5 for its simplicity (and ease of use with the two hosting options mentioned above), I was wondering how each option compares in development time, because I would imagine executing a CMS site would be as easy as writing blogs. Any suggestions would help?"} {"_id": "15006", "title": "baffling comment spam", "text": "I've been seeing some odd comment spam on one of my sites. Odd because there are no links posted. Just, \"Wow, that's a really celevr way of thinking about it!\" or similar. Note the typo. The messages change but they almost always seem to have a typo. I'm wondering if that may be on purpose, perhaps as a way of tracking successful comments or some such (like subtly and uniquely altering several copies of a classified document to pinpoint who a leaker was). I've noticed that often one of the fields will be left empty, including the comment body. This suggests to me that this is a bot that's testing for honeypot fields. Anyway, the baffling thing about these is that there are never any links posted. I can't figure out what the point of this would be. Any ideas?"} {"_id": "27377", "title": "Prepare website for heavy traffic", "text": "A site of mine is going to be featured on a very popular tech blog. I already upgraded my server to handle the traffic. 
What other things should I be prepared for?"} {"_id": "26561", "title": "What can cause a connection reset when site is visited with Firefox?", "text": "I have a site on Godaddy and in the last few days, whenever I visit a page on the site with Firefox, I get a connection reset error first and I have to reload the page manually to load it. I haven't changed anything on the site recently and the problem occurs only with Firefox, not with Opera/IE/Chrome. Apparently, my visitors also experience this, because I have a huge drop in visits on the site. Is it a Godaddy problem? Something they did? But then why is no other browser affected, only Firefox? Or can it be an issue with the newest version of Firefox, so it's not Godaddy's fault? Has anyone else experienced such errors with Godaddy and/or Firefox?"} {"_id": "47159", "title": "Do indexed WordPress category/tag pages actually help traffic?", "text": "Several days ago I read somewhere that category and tag pages should be noindexed. But later, somewhere else, I read that they only increase rank for a search term while overall traffic decreases - one person said so from his experience. So, if I am just concerned about traffic, and not so much about individual search terms, should I let Google index those category and tag pages? I am confused about which is best to do. Thanks in advance. Regards, Rana"} {"_id": "65067", "title": "Dynamic news showcase on homepage and impact on SEO and indexing", "text": "I want to highlight the latest news of the week or month on the homepage of my website. On another page of my site, I have a complete and comprehensive list of the news that would duplicate what is on the homepage. So the news on the homepage would be variable. In fact, I would have to remove each item after one month. Even if I have the same news URL links on another page, would removing the URL links from the homepage heavily affect Google indexing? Is there a possibility that I could lose position in the SERPs? 
This would practically make my site into a daily newspaper site!"} {"_id": "24709", "title": "How can I get stats for what 3rd-party sites have embedded our iframe widget?", "text": "Say we've produced a widget for other sites to use, like so: The client would like to be able to see within GA who has embedded the thing. Is there some referer information automatically passed that I can look for, or do I need to add something? `whatever.php` is already loading the analytics JavaScript (we're also tracking clicks on an outbound link). **[EDIT]** Looking around a bit more, I found what seems to be a similar question on SO with an answer saying this can be found automatically, but I still can't seem to find the information. The question's also old enough that the respondent is probably referring to the old interface, though. Maybe someone could explain getting to it in the new look. (I won't likely be able to train this client to switch, deal with the old look, etc.)"} {"_id": "24704", "title": "Confirm that someone has Google +1'd a link", "text": "I'm considering a campaign for my existing customers that would offer them a discount on products when they +1 our site. Is there a way to prove that someone has +1'd our site?"} {"_id": "59302", "title": "Sitemap.xml / Update Frequency + Priority 0", "text": "I'm working on building sitemaps and am not sure what should and should not be added. For example: `https://www.website.com ` `https://www.website.com/en/ ` `http://www.website.com/index.php ` `http://www.website.com/en/index.php` All of these URLs lead to the same page. _https://www.website.com/ is the preferred address._ **Should I write all of these variations in the sitemap and assign the non-preferred ones an update frequency of never, priority 0, or simply not include them in the sitemap at all?** Nearly every link on the site could be reached with a similar combination. 
Similarly, I have many links that contain parameters: https://www.website.com/en/catalog?utm_source=Blog&utm_medium=CTA&utm_campaign=Test **Should these be in the sitemap with the same update frequency never, priority 0, or simply not included?** The documentation I've read says that the sitemap is used to teach the search engine about your site and give priority to pages, so it makes sense to me that I would want to add all of these links so I can explicitly educate the search engine NOT to bother checking these links. ## Additionally The site contains a catalog of products which can be filtered through in multiple ways (by color and size, for example). Each filter changes the URL like so: `The main catalog: ` https://www.website.com/catalog/ `filtered for blue: ` https://www.website.com/catalog/color-blue/ `filtered for large: ` https://www.website.com/catalog/large/ `filtered for blue & large:` https://www.website.com/catalog/large/color-blue/ With three types of filters and 5 to 15 options for each filter, this potentially creates 75 different links which essentially display the same content. Should all of these potential links be listed in the sitemap, only the main catalog link, only the first level of filter links, or every potential URL?"} {"_id": "24702", "title": "Service for accepting short video clips as part of official documents?", "text": "I am looking for a service that can accept short video clips flexibly and convert them to an archival format. So the process would be: * Person in the field uploads a video by: 1. Smartphone upload 2. Upload from camera via laptop 3. Upload from browser via Flash * Service normalizes the video to some standard format (AVI? with standard quality) * Service generates a thumbnail and embeds it in a larger report * Thumbnail is clickable to get the larger video These are private assessment videos, so they need to be confidential. 
Most of the online services out there (YouTube, Vimeo, etc.) do parts of these, but don't have a commercial or archival component. Ideally the service would have an API, but if it satisfies the bullet points above, a \"self-serve\" YouTube-quality service would be great. I did some searching and mostly I've found components limited by browser (ActiveX controls and such) or not allowed to be private. I'd prefer to buy a service rather than reinvent this wheel. Thanks in advance!"} {"_id": "63104", "title": "Is there any way to simulate a slow connection between my server and an iPad (without installing anything on the server)?", "text": "Some of our webapp users have difficulty on slower connections. I'm trying to get a better idea of what that \"speed barrier\" is, so I'd like to be able to test a variety of connection speeds. I've found ways to do this on Windows but not on the iPad, so I'm looking more for some sort of proxy service that'll work with any device (not running ON that device). I did find an article about using Charles Proxy and providing a connection to another device, but I was hoping for something simpler (need not be free). _Constraints_ * We are on a shared server so we can't install anything and we are limited in our control over that server. * I'd like to test an iPad, Android tablet, and Windows PC."} {"_id": "65268", "title": "Using an external mail sending provider for my websites?", "text": "Let's say I have made 20-30 websites that send emails such as contact form responses, welcome emails, forgotten password reminders, and similar transactional emails. What is the benefit of using a third party email service provider to send these emails? I don't quite understand it _exactly_ yet, or whether or not it suits my needs, but I require reliable email delivery and want to ensure that all emails are delivered correctly, set up in the right format, get delivery responses, and minimize emails ending up in a spam folder. 
Would an external email service provider be able to do this better than sending email directly from the server itself? And could I use the third party email service on my local server while building/testing sites?"} {"_id": "30383", "title": "Configuring osTicket to fetch mail from an email account", "text": "I'm trying to use osTicket to implement a ticketing system, but I'm not able to fetch the mail from an email account even after configuring the host, port, user name and password."} {"_id": "65264", "title": "Communication between Amazon EC2 & S3", "text": "So here is the use case: Our company has customer data files stored in S3. Their website is hosted with a third party hosting company (let's call it Hosting X). They are planning to provide a new offering where the customer can select a cloud space he has an account with, like Dropbox, SkyDrive, etc. Then the company will transfer the customer data to that cloud space account using the cloud space's API calls. We have done cloud uploads in the past, but the data servers were ours, so we just had a bunch of scripts running on the data servers. The scripts would fetch the data files locally and upload them to the desired cloud space. In the current scenario I am thinking the following are two possible options: Option 1. Download data files from S3 to Hosting X. Then run a script on Hosting X which will upload the local files to the desired cloud space. Option 2. Host the upload script in EC2. When a customer requests a data transfer, a script from Hosting X will hit the upload script in EC2. The upload script in EC2 will fetch the files from S3 and upload them to the cloud space. I believe Option 1 will have a recurring cost associated with it, since every data transfer from S3 to outside the AMZ infrastructure costs money. In any case I am not sure if these solutions are possible. Can EC2 and S3 communicate like this? Will EC2 need to download the data files from S3 or can it access them as if they are \"virtually local\"? 
Does anyone have any experience with such a scenario? Any suggestions or tips are GREATLY appreciated!!"} {"_id": "65265", "title": "Transfer domain fails due to Extensible Provisioning Protocol (EPP)", "text": "So I'm trying to transfer the hosting for a domain from Biz.nf to Bluehost. The domain was originally registered with Biz.nf. The domain is unlocked, and so is ready to transfer. However, I need the EPP code, and that is where the problem comes in. For whatever reason, the current domain registrar is Wild West Domains (GoDaddy), despite the fact that I have never dealt with them. How exactly did this happen, and what can I do to get the EPP code?"} {"_id": "8296", "title": "Menu icons in Drupal?", "text": "How do I get menu icons in Drupal? I have the Menu Icons module... but are there menu icon themes or packs or something? I am using the Deco theme, and want menu icons like this: ![alt text](http://i.stack.imgur.com/YprMH.png) What is the easiest way to do this?"} {"_id": "65263", "title": "Thousands of backlinks from Meetup.com", "text": "I have been wondering something for a while now. We have a manual penalty from Google for unnatural linking. When I check out \"Links to your site\" in Webmaster Tools, we have 11,387 links from Meetup.com. Is this harming us? After clicking through some of the links randomly, I have discovered they are from one of our dealers that sells our products. He is sponsoring different groups and so the ad appears; the problem is he is linking back to our site and not his own. The anchor text is all the same; it never changes."} {"_id": "49142", "title": "Attaching a Google AdWords conversion to a campaign", "text": "In Google AdWords, I've created a conversion and put the script onto my webpage, but now I don't know how to connect this conversion to a campaign so that it shows up in the table of campaign data. 
How do I do that?"} {"_id": "27727", "title": "How can I set the default page for an https request?", "text": "We have a website which has a virtual directory containing the secure portion of the website. If users come to `http://www.mydomain.com`, they should get directed to `default.aspx` of the main site, but if they go to `https://www.mydomain.com`, they should go to `default.aspx` of the virtual directory. The default page for the main site works fine, as does the secure page if I navigate to it using the full name; however, I can't figure out how to set the default page for https traffic that doesn't specify a specific page. 1. `http://www.mydomain.com` - Works 2. `https://www.mydomain.com` - Page Not Found 3. `https://www.mydomain.com/myvirtualdirectory` - Page Not Found 4. `https://www.mydomain.com/myvirtualdirectory/default.aspx` - Works What do I need to do to make links 2 and 3 load the default page shown in 4? My website is running on IIS 6.0 on Windows Server 2003."} {"_id": "22012", "title": "Rel = translation", "text": "I can't find much online about rel=\"translation\". We have tutorials and manual entries which we are going to get users to translate. If the original page in English is: http://www.scirra.com/tutorial/start And there are two translations: http://www.scirra.com/tutorial/es/start (Spanish) http://www.scirra.com/tutorial/de/start (German) How would I correctly link all these up? I'm aware at the top of the page I would need to specify the correct ISO 639-1 code: But I'm more interested in letting Google know they are not duplicates but are translated."} {"_id": "22016", "title": "Strange issue with Wordpress sites, is it PHP Memory?", "text": "This has happened to me twice with the same host and I want to know the real cause. I have multiple WordPress sites hosted on a shared server. One day when I attempt to visit any of the sites, the webpage simply downloads the index.php file. 
It happens on all WordPress sites but not on static sites hosted there. I understand this is a PHP issue on the server, but what could be happening specifically? The only thing I could find when searching is something to do with memory limits. Is this common? Should I be worried about this host?"} {"_id": "27898", "title": "Abnormal Alexa score: the more popular site has a worse rank", "text": "I have 2 ecommerce websites in France, and for these websites I see strange behaviour in the Alexa results. Here are some statistics about the websites: **Unique Visits January 2012** * Website A : 158,828 * Website B : 58,867 **Number of Google Search Results** * Website A : 5,100 * Website B : 56,000 **Links to my site** * Website A : 3,120 * Website B : 2,180 **Alexa Score** * Website A : 405,804 * Website B : 278,944 How come website B, with 1/3 of the visitors of website A, has a much better Alexa score (x2) than website A?"} {"_id": "43975", "title": "Google Authorship not displaying", "text": "A few months back I set about the task of verifying my blog content for Google Authorship. I'm fairly confident I've ticked all the boxes, and when I test my posts with the Google Structured Data tool it informs me that authorship is working for this webpage. It has been like this for around 2 months but I am still not seeing my authorship profile in the search results. Do a search for \" **jQuery Parallax Scrolling Tutorial** \" and you will see my blog post near the top of the results but no authorship data. _**Can anyone suggest why this is not appearing when Google itself tells me that everything is in order?**_ Thanks"} {"_id": "53954", "title": "Why has my authorship image been removed from Google SERPs", "text": "I recently added the relevant tags to my site and validated my email for Google Authorship. After around a week my image started showing up with results. Today the image is no longer present and I haven't changed anything. 
The Structured Data Testing Tool results show that it should all be working. Have I missed something, or does anyone know why it has stopped displaying? Other questions ask about situations where the image has never appeared. My image has appeared but has since stopped appearing, so this is a different situation."} {"_id": "55972", "title": "Google Authorship - mugshot not showing", "text": "I've set up Google+ authorship for my photography blog, and used the Google Structured Data Testing Tool to verify the following: * Authorship is working * Authorship email verification is working * rel=author markup has successfully established authorship for this webpage. * Publisher markup is verified for this page Unfortunately my authorship information is not showing in Google search results. Could the last item on my list above (Publisher markup verification, which is set to use my _business_ page as opposed to my personal page) be causing some form of interference? It's my understanding that blog posts may only be linked to a Google+ personal profile, not a business page, but that you can set up Publisher markup for a business page."} {"_id": "44002", "title": "Microformatting accepted, then ignored", "text": "I have implemented microformatting on one of our websites for our address, breadcrumbs and the reviews that we collect from our customers. After we launched the new website using the microformatting, Google accepted it all and produced perfectly formatted rich snippets. After about a month it all got ignored and queries returned 'normal' snippets again. The Rich Snippet Testing Tool indicates all is marked up correctly. Is there anything else I can do? 
What is the best practice in this matter?"} {"_id": "2308", "title": "What is \"?sfgdataq\" that I see appended to some requests to my application?", "text": "I have noticed that a small number of requests come through my application with ?sfgdataq appended to them. I probably wouldn't have ever noticed it if it didn't cause an error on some requests. I have seen requests come through with a mix of user agents (Firefox, IE 7, IE 8), so I don't think it is a bot. I looked around and I can't find any information on what is doing this. I found a couple of people with similar questions on message boards (so I don't think it is isolated to my application) but no good answers. edit: I also noticed that the User Agents have some sort of large random value appended: +sfgRmluamFuX1R5cGU9amF2YV9zY3JpcHQmRmluamFuX0xhbmc9dGV4dC9qY is a sample - searching for that on Google turns it up in other logs as well."} {"_id": "49063", "title": "Putting duplicate phrases in the headings of pages", "text": "If I have a local business that repairs only HP computers, for example, and I write lots of good informative pages about all the different things that can be repaired on HP computers, such as motherboards, screens and keyboards, should I make the `h1` headings of my pages: * Motherboard repair for HP computers * Screen repair for HP computers * Keyboard repair for HP computers or * Motherboard repair * Screen repair * Keyboard repair ? What difference does it make to search engines? If the first option is a bad idea, how should I make sure to communicate to search engines that my business specializes in HP computers?"} {"_id": "13619", "title": "What to set for AllowOverride & Options", "text": "What settings are normally used on production or development servers for `AllowOverride` & `Options`? 
I will be needing to use .htaccess for WordPress, for things like mod_rewrite etc."} {"_id": "43464", "title": "DMCA service on Bing search engine", "text": "Does Bing offer a similar service to the one Google search engine offers below? https://www.google.com/webmasters/tools/dmca-notice?hl=en Thanks."} {"_id": "43465", "title": "More preview lines in Google search results", "text": "Is there a way to get more than 2 lines of query result preview per item on a Google results page?"} {"_id": "4725", "title": "Merge two existing forums together: add phpBB 2.x posts into an existing SMF 1.1.x", "text": "I have two community websites, each running their own forum, and I would like to merge the content of both forums into one forum. Specifically, from phpBB 2.x into SMF 1.1. Both forums have been running for years now and have built up 100k posts each. I found a converter that converts phpBB 2.x to SMF here: http://download.simplemachines.org/?converters;software=phpbb But when I install and run this script I get the following warning: _All or some of the data in your installation of SMF will be overwritten._ So what I am really looking for is a forum merge, not a conversion. Any suggestions on how to merge two forum databases into a single database?"} {"_id": "13615", "title": "Whole site caching on MediaTemple's ProCDN - cname setup", "text": "I am unable to properly configure ProCDN to point to my website's root. I have a completely static website: http://thaifood-recipes.com (no PHP, MySQL, etc. - just static HTML, images, CSS, etc.). I have followed the Getting Started with ProCDN guide at MT, but instead of offloading some content from a subdomain, like cdn.thaifood-recipes.com, I want the whole site to be served this way, as it's all static, cacheable content. MT Support is dragging its heels getting back to me with a direct answer. Any ideas, or alternative CDNs that do the same thing if ProCDN won't allow it? 
Cheers, Leon Stafford leonstafford.com"} {"_id": "43077", "title": "How can I change the location of my IP address to specific cities/places when browsing?", "text": "I am based in Europe and I would like to test my website as if I am located in New York. I have some specific features that will be visible based on cities."} {"_id": "43469", "title": "Which favicon gets loaded in the browser?", "text": "Currently I am experimenting with 2 favicons and including them in 3 different ways in my markup; however, I am not sure which one gets loaded in the browser. Firebug's network tab doesn't tell me which one it downloaded. favicon.png is 144x144px and favicon.ico is 16x16px. Ideally I want all browsers to use the PNG instead of the ICO whenever they support it, and just fall back to the ICO for IE. I am not sure how to verify if the code is already doing this or not."} {"_id": "4721", "title": "What tools to use for efficient link building?", "text": "As most SEO experts keep saying, it is not just the content that you have - but also a hefty amount of _quality_ incoming links to your content that is important - these are the two ways to get to the top of the search results. The question is: where do I find the incoming links? One way I know of is Google blog search: it can be used to find blogs with information related to your content, and some allow you to leave comments. The comments usually consist of your name, e-mail and website. If you put your keyword instead of your name, then the keyword turns into a link to your website. Unfortunately most blogs put the `rel=nofollow` attribute on such links, but some blogs don't do that. What other ways are there to **find quality pages to put keyword links** back to your website? 
A quality link usually means: * it is located on a page with relevant content * it does not have a `rel=nofollow` attribute in the `<a>` tag * it has a relevant keyword as the anchor text, as in `<a>keyword</a>` * the page with the link has high PageRank (3+) and TrustRank"} {"_id": "4720", "title": "Suggestions for Document Archive Site", "text": "I have been charged with documenting our organization's paper assets, which date back to the early 1600s, and was wondering if there is an online platform already existing for this type of website. It would mainly consist of scanned versions of the documents, their transcripts (given some of the writing used, it may as well be written in double Dutch!) and some method of tagging each document to a particular title. My initial thoughts were along the lines of a Joomla system, with relevant tagging and other plugins installed. Thanks!"} {"_id": "4723", "title": "What are natural growth rates for incoming and outgoing links?", "text": "It appears to be no more than 10-15% per month; is that correct? Also, along the same lines, what is the max number of outbound links a site can start out with and not be flagged as spam?"} {"_id": "13613", "title": "Twitter Tweet or TwitterMeme button?", "text": "The Twitter Tweet button and the Tweetmeme Retweet button seem to pretty much do the exact same thing. Which should I use and why? -- It seems that Tweetmeme is now recommending the Twitter Tweet button. But I still see lots and lots of Tweetmeme buttons around the 'net."} {"_id": "16065", "title": "Free Blog site that allows PHP?", "text": "I was wondering if anyone knows of a free blog site (i.e. WordPress, Blogspot, LiveJournal) that allows people to include PHP in their posts? Any leads would be greatly appreciated. Cheers!"} {"_id": "57023", "title": "Suitable ad service / Ad-revenue model for my one-time use high traffic website?", "text": "I'm about to launch a website for which I expect a high number of visitors and lots of traffic. 
However, this traffic will only last for a few days once the joke has been passed around enough (in the past, my sites like this have gotten up to five million hits). I've never advertised on them (they've been purely for fun or for some subliminal political message) but I was thinking perhaps there was money to be made. I checked out AdSense, but they have a long approval process, which the timing for my website release doesn't allow. Does anyone have any recommendations of an advertising service I can employ to make a few bucks on my site's surge of traffic, and that can be set up quickly?"} {"_id": "47921", "title": "Alternative to Google Adsense", "text": "I have a web dev blog which averages ~100 uniques per day, on which I have a few ads here and there via Google Adsense. These ads generate around \u00a35 per month for me. While I'm not out to make money, it would be nice to be able to cover my hosting and domain costs. Can someone recommend a good alternative to Google Ads which could bring in a little more? If so, could you also detail your experience with as much info as possible, please? Again, I'm not looking to make money off my blog but if I have ads on there, I may as well get the most out of them and if I can turn that \u00a35 a month into \u00a310 a month, it's a good start. I'm very naive to the world of online ads and any search on the topic returns clearly promoted posts with impossibly high numbers."} {"_id": "5844", "title": "My readers don't have Digg or StumbleUpon, how can I ask them for help raising my SERP?", "text": "I have a coupons-code web _page_ that receives 40,000 unique visitors a month, but its numbers are stagnant or dropping because it is #3 in Google's SERP. 75% of its visits are new and the page has been around since 2007. 
I sincerely think my resource is helping people and I sincerely think I am loads better than #1 and #2, because I have the freshest coupons and I have no ads -- so I am at my wit's end on how to raise my SERP position. The rest of the sites in #4, #5, etc., are huge coupon broker sites, so I really have no chance pursuing link exchanges. Meanwhile, the site in #1 and #2 is also a small guy who has also been around since 2007. None of them update their coupons page nearly as often as I do. (The #1 guy updates once a month. :( ) My readers do not use Digg, StumbleUpon or Twitter, or have blogs of their own. I tried Facebook's Like button and tweeting reader comments via bit.ly but got no SERP improvement. How can a site ask its readers for help raising its SERP position? Or any suggestions?"} {"_id": "5843", "title": "Why don't my Google Analytics custom segmentation visit numbers match up?", "text": "I have three main areas of my site and want to track total usage as well as breakdowns of the three parts. I am trying to segment the \"type\" of use on each page using a custom variable like this: ['_setCustomVar',1,'Visitor Type','Unknown',1] Visitor type can be one of three values: \"Unknown\", \"Reader\" or \"publisher\". Every page has this value set. Now when I look at my analytics chart and choose all three segments, the individual values do not match the sum. I've double checked the pages to make sure the custom var is there. ![Google Analytics Screenshot](http://i.stack.imgur.com/vTUXK.png)
I've never seen 'other' before, and it was small (only 0.5%), but what does \"Other\" mean?"} {"_id": "11281", "title": "Display Google Custom Search results on your own site", "text": "Is there a way for Google's Custom Search engine to show its results on a page on your website? On a custom page which you put on your server, so you can still display ads and links to pages in your site together with the search results?"} {"_id": "2335", "title": "What is your favourite javascript lightbox implementation?", "text": "There are so many implementations of the lightbox; which is your favourite to use for your custom site? (javascript based) I generally use Lightbox2"} {"_id": "19708", "title": "Which method is the best to specify the language of a page?", "text": "There seem to be two ways to specify the language of a page: `<html lang=\"en-us\">` and `<meta http-equiv=\"content-language\" content=\"en-us\">`. Which one is the preferred way? I know I could just add both tags but I'd rather not have duplicate content. Also, do both methods use the same locale format (i.e. \"en-us\" and not \"en_US\")?"} {"_id": "5849", "title": "Google Traffic Does Not Add Up", "text": "I have just done a search for two different keyword phrases. When I look at both of the results, they have the same number of \"Local Monthly Searches\" according to the Keyword Tool in Google Adwords, which is 90,500. This is an \"exact\" match for the keyword phrases. I currently rank, and have ranked, in the #3 spot for both of these keyword phrases for almost a year now. TRAFFIC RESULTS - One site gets 2500 visitors a day and the other gets 100 visitors a day. WTF? Why such a disparity in the number of visitors per day when both sites are ranking #3? 
It doesn't make me trust the numbers from the Keyword Tool in Google Adwords when they are so ridiculously off like this."} {"_id": "5848", "title": "How to create an overlay div using CSS only (no Javascript)?", "text": "I'm trying to figure out how to create a div that overlays the page, staying in one place when the page scrolls, such as used in this article: Top 5 Botched PC Game Launches. (You will see a bar at the top that doesn't move, with the logo, _where gamers call home_ ). I'm not even sure what the correct terminology is - it's probably not overlay :) But whatever it is, it appears to be done with CSS only, as it shows up when Javascript is disabled."} {"_id": "17620", "title": "How do you track a website using multiple Google Analytics accounts?", "text": "Suppose you have two distinct Google Analytics accounts. How are you able to send request data to both accounts on a single page load? For example, on a normal account you'd include code such as: var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-12345678-1']); _gaq.push(['_setDomainName', 'none']); _gaq.push(['_trackPageview']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); Could I simply include another snippet with a different UA ID? Can I pass several UA IDs in one request? Is this even possible to accomplish? Any help would be greatly appreciated!"} {"_id": "56826", "title": "Do I set a DNS A Record for the new GitHub Pages to use their CDN?", "text": "GitHub updated their Pages service for custom domains yesterday and I'm not clear on one small detail. They recommend a CNAME record for `www.example.com` but do not recommend using A records for the apex domain. 
I want to make sure both `example.com` and `www.example.com` resolve correctly to `www.example.com`, and **also get the benefit of GitHub's new CDN**. Do I use a CNAME for www only then, with no A record at all? This feels like a dumb question as I type but I'm just not getting it."} {"_id": "17622", "title": "How do I get a page on my site to appear in the \"video results\" section", "text": "Direct Lyrics appears at the top of these search results as a \"video result\" (next to an inviting thumbnail). But the page in question is just a page of lyrics with an embedded YouTube video. My lyrics site has embedded YouTube videos on its lyrics pages as well; how can I get my site into the video results section? ![](http://i.stack.imgur.com/mcuJT.png)"} {"_id": "4803", "title": "The Sitemap Paradox", "text": "We use a sitemap on Stack Overflow, but I have mixed feelings about it. > Web crawlers usually discover pages from links within the site and from > other sites. Sitemaps supplement this data to allow crawlers that support > Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using > the associated metadata. Using the Sitemap protocol does not guarantee that > web pages are included in search engines, but provides hints for web > crawlers to do a better job of crawling your site. Based on our two years' experience with sitemaps, there's something **fundamentally paradoxical about the sitemap** : 1. Sitemaps are intended for sites that are hard to crawl properly. 2. 
If Google can't successfully crawl your site to find a link, but is able to find it in the sitemap _it gives the sitemap link no weight and will not index it!_ That's the sitemap paradox -- **if your site isn't being properly crawled (for whatever reason), using a sitemap will not help you!** Google goes out of their way to make no sitemap guarantees: > \"We cannot make any predictions or guarantees about when or if your URLs > will be crawled or added to our index\" citation > > \"We don't guarantee that we'll crawl or index all of your URLs. For example, > we won't crawl or index image URLs contained in your Sitemap.\" citation > > \"submitting a Sitemap doesn't guarantee that all pages of your site will be > crawled or included in our search results\" citation Given that links found in sitemaps are merely _recommendations_ , whereas links found on your own website proper are considered canonical ... it seems the only logical thing to do is _avoid_ having a sitemap and make damn sure that Google and any other search engine can properly spider your site using the plain old standard web pages everyone else sees. By the time you have done _that_ , and are getting spidered nice and thoroughly so Google can see that your _own site_ links to these pages, and would be willing to crawl the links -- uh, why do we need a sitemap, again? The sitemap can be actively harmful, because it distracts you from ensuring that search engine spiders are able to successfully crawl your whole site. \"Oh, it doesn't matter if the crawler can see it, we'll just slap those links in the sitemap!\" Reality is quite the opposite in our experience. That seems more than a little ironic considering sitemaps were _intended_ for sites that have a very deep collection of links or complex UI that may be hard to spider. 
In our experience, the sitemap does not help, because **if Google can't find the link on your site proper, it won't index it from the sitemap anyway.** We've seen this proven time and time again with Stack Overflow questions. Am I wrong? Do sitemaps make sense, and are we somehow just using them incorrectly?"} {"_id": "56778", "title": "Do I need to produce an xml sitemap for a social network?", "text": "I'm working on a social network that currently does not have an xml sitemap that has been submitted to Google Webmaster Tools. Will this add much benefit, given that it is a social network with frequently updated content? If so, will I have to update it daily (or use a tool to do so)? Thanks"} {"_id": "49549", "title": "Do you really need robots.txt and sitemap.xml if I want bots to follow every link?", "text": "I wonder if I really need _robots.txt_/_sitemap.xml_ files if I want bots to go through every link on my website without any restrictions? And if I do, what should I write in there?"} {"_id": "58636", "title": "How to make Google index all sitemap.xml records?", "text": "I have a _sitemap.xml_ index file with 3 XML files in it, with around 120 pages for indexing. A year has passed and I still have only 1/3 of it indexed. All the most important pages for indexing are in the 1st sitemap file; however, Google takes a similar amount from each of the sitemap files instead... How can I make Google index all _sitemap.xml_ records? I have the crawl rate set to MAX in GWT."} {"_id": "52052", "title": "How to aggregate events across multiple sites in Google Analytics?", "text": "I have several sites that I'm tracking on my Google Analytics account. These sites all have the same events and event actions. I'd like to generate a report for the same event action across all of my sites. 
Can this be done right from the GA interface, and if so, how?"} {"_id": "3749", "title": "Is there a googlebot equivalent for RSS feeds?", "text": "I have a lot of feeds generated on my pages - and it seems they are going to waste (Googlebot does not seem to see them). The RSS feeds display correctly in the pages when viewed through a browser, so I know they are working correctly. Is there a way I can somehow make the RSS feeds 'discoverable' by the world? (so I can drive traffic to my site?)"} {"_id": "3746", "title": "How to charge for banner ads in a web application", "text": "I'm developing a web application which contains banner ads, and I don't know how to teach my clients how much to charge their customers for banner ads. Can someone help with a book or business strategy?"} {"_id": "17628", "title": "What % of a page needs to be unique in order to prevent duplicate content penalties?", "text": "I have a site which uses some content generated by myself, some from users, and some from a source such as Wikipedia. The Wikipedia-sourced content is unmodified but makes up no more than 30% of the page, and is never the entire wiki article, just an extract of a few paragraphs. How much new content do I have to add to the Wikipedia source to avoid being marked as duplicate content and penalised in my SEO?"} {"_id": "57966", "title": "Error while importing .sql file in phpMyAdmin", "text": "I am getting an error while importing a `.sql` file in phpMyAdmin; please help. 
Error SQL query: CREATE TABLE `sbbleads_admin` ( `sb_id` BIGINT( 20 ) NOT NULL AUTO_INCREMENT , `sb_admin_name` VARCHAR( 255 ) DEFAULT NULL , `sb_pwd` VARCHAR( 255 ) DEFAULT NULL , PRIMARY KEY ( `sb_id` ) ) TYPE = MYISAM ; > MySQL said: Documentation > > 1064 - You have an error in your SQL syntax; check the manual that > corresponds to your MySQL server version for the right syntax to use near > 'TYPE=MyISAM' at line 6"} {"_id": "3742", "title": "Confirm (enter twice) pw and email at registration - good idea?", "text": "Many sites seem to require that their users enter their email and password twice during the registration process. How useful is it really? It seems it could be a pain in the neck for many users, for limited gains, no? Thanks, JDelage"} {"_id": "62959", "title": "Google Analytics: View for Affiliates: How to include only one brand in E-commerce module?", "text": "I am trying to create a view for a partner that is selling their products on our site. I assumed that filtering their brand name with **_E-commerce Item Name_** would result in the same report as filtering their brand name by **_Product_** in the **_E-commerce_** > **_Product Performance_** report, but it didn't. How should I approach creating a view that would include only e-commerce data related to their products?"} {"_id": "3741", "title": "Looking for usability studies of user authentication methods and interfaces, do you know of any?", "text": "I've been having issues with Stack Exchange creating weird usernames for me and grouping them with my account; it appears to be a known issue, and I emailed the Stack Exchange team... which leads me to my question: I'm looking for usability studies of user authentication methods and interfaces, do you know of any?"} {"_id": "18321", "title": "rel=\"nofollow\" in img tags?", "text": "Does it make sense to have a `rel=\"nofollow\"` attribute in an `img` tag? Do search engines' crawlers use this attribute in some way for calculating PageRank score? 
What is the practical sense for using it (if any)?"} {"_id": "4622", "title": "Nameservers at freehostingcloud", "text": "How can I change the nameservers at freehostingcloud? I got a message: You need to change the nameservers for your domain exampledomain.net. The nameservers should be ns1.freehostingcloud.com and ns2.freehostingcloud.com."} {"_id": "4624", "title": "Is Google Sitesearch better than internal search?", "text": "In terms of SEO, is there any benefit to using Google SiteSearch over internal search (e.g. WordPress' own search engine)?"} {"_id": "4627", "title": "How to stop Google indexing both www. and no-www versions of my website", "text": "When I do a search of my site in Google it shows both: * `mydomain.com` * `www.mydomain.com` In my .htaccess file I added code to redirect the non-www version to the www version, but both versions are still showing in Google's search results. _**Is there any way to have Google only show the www version?**_"} {"_id": "11019", "title": "Adding custom handlers", "text": "I have a number of D language programs (.d extension) and a compile-and-run tool (rdmd), and I'd like to configure Apache (version 2.2) to handle URLs such as \"mysite.com/d/hello.d\" by running rdmd with the requested resource. The programs do output valid HTTP headers. I've researched AddHandler and Action, but have been unsuccessful in getting them to do what I want, namely executing rdmd with the resource. I suspect that the problem is that rdmd needs to be in some special CGI directory. Can someone walk through the steps necessary to configure Apache on a Debian-based system, such that it executes my files instead of offering them up for download? 
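For reference, this is the sort of configuration I have been experimenting with, without success so far (the handler name and wrapper path here are just my guesses):

```apache
# Guess: treat .d files as CGI-style resources handled by an rdmd wrapper
ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory /var/www/d>
    Options +ExecCGI
    # Map the .d extension to a custom handler name...
    AddHandler d-program .d
    # ...and have that handler invoke a wrapper script (requires mod_actions)
    Action d-program /cgi-bin/rdmd-wrapper
</Directory>
```

where /cgi-bin/rdmd-wrapper would be a small executable script that runs rdmd on the file Apache passes in the PATH_TRANSLATED environment variable.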
Directory-specific ('/d') or site-wide execution are both fine."} {"_id": "39647", "title": "Redesigning a page while not using old images but also not losing rank", "text": "I have redesigned one of the pages of a website I'm currently working on, and some of the images on this website, 2 to be exact, rank on the very first page of public Google search results. Those pictures are okay, but they have a nasty-looking background and look a bit blurry. I removed the background, did a bit of work on them and they are better now. All of the other pictures are also indexed by Google but don't rank so well. I have found that **certain file naming techniques worked better for ranking** than the ones previously used for all those images, for example, so since that page was redesigned, I have a couple of alternatives: > 1 - Leave the old files on the server, and don't use them. Name the new ones > the correct way and hope they rank better. (Old pictures would probably not > be listed on newer sitemaps, since they are not linked to, which may cause > rank loss?!) > > 2 - Name the new files the exact same name as the old ones, discarding the > old files (They would be the same image in most cases, but cleaned up). > > 3 - Name the new files with new names, and 301 redirect the old files to the new ones. **What's the best alternative to achieve the best results?**
The site I'm about to launch is concerned only with offering a video creation service, its purpose being to provide an online portal to customers so that they can order a video from our company. We're not in the business of writing content, and the content on our site will be updated only in line with how our business model changes. In short, the site is NOT there to entertain people - it's there to serve a practical, business-related purpose. My question is twofold: First of all, are we doomed to suffer poor rankings because we refuse to change our site into something that Google thinks is _better than_ or _more useful than_ other sites? And secondly, if not, then what might be the best approach to ranking well with such a site?"} {"_id": "11754", "title": "Site on 2 hosts", "text": "Is there a way to add 2 hosts per domain, so that if a host dies it automatically switches to the other one? Or if it's slow you can easily switch to the second host? Thanks!"} {"_id": "11752", "title": "Why is video streaming so slow from my site?", "text": "I have some .flv files on my site and I've been using JWPlayer to embed them into my webpages. The embedding looks like this: &autostart=true\" allowfullscreen=\"true\" allowscripaccess=\"always\" id=\"player1\" name=\"player1\" src=\"/jwplayer/player.swf\" width=\"500\" height=\"350\" /> The file is coming from the directory tmp and is saved in the format *.flv. Recently, I have been noticing really slow streaming. The video will stop almost every 5 seconds. This is happening on multiple computers on multiple networks, and I'm not having the same problem on different sites (i.e. YouTube). I'm hosting the site through Yahoo Small Business and I have a lot of video files in the tmp directory. Could anyone recommend a way to allow my videos to be streamed faster?"} {"_id": "11751", "title": "How to translate facebook like button on my forum?", "text": "I have a Facebook Like button plugin on my Vanilla forum. 
Although I added Finally, I've triple checked that the UA is the correct text. and yes, the global account is `-1` and the specific domain is `-11`."} {"_id": "36300", "title": "Google analytics not provided for 55% of total traffic", "text": "I've been here and here to learn what `(not provided)` means. Now the question is if what I am seeing in my Google Analytics stats for my website is considered normal (and whether I can/should do anything about it). Here are the statistics from one day, but other days are similar: 102 visits, 57 are from `(not provided)` - that's over 55% unknown keywords. Is it normal to have it like that? Does Google plan to do anything about it? In other words, what's the perspective? In my understanding, with this approach, as people switch to `https`, Analytics will stop being useful. Please correct me if I am wrong in my assumptions."} {"_id": "36307", "title": "PHP W3 Validator API, Is this good?", "text": "I was trying to find a way to see if my site's code was valid or not, but I was continuously going over to the W3 Validator, so I decided to make an \"API\", however it really isn't one! I just wanted to know if anybody can find a better solution to the one I have made. This is what I currently use, with the usage of ?uri=http://www.mydomain.com : $Start = strpos($URL, \"<title>\") + 7; $End = strpos($URL, \"</title>\"); $Title = substr($URL, $Start, $End-$Start); if(preg_match('[Invalid]',$Title)) { //Code is INVALID echo \"INVALID Source\"; } elseif(preg_match('[Valid]',$Title)) { //Code is VALID echo \"Valid Source\"; } else { //It Went WRONG echo \"\"; } }"} {"_id": "49172", "title": "Looking for a Domain Manager", "text": "I am looking for the best Domain \"management\" company. I have about 500+ domains at the moment and acquire more on a daily basis. I would prefer to have a full-service company take over the daily management and selling side of all domains, and want to list them on Afternic, GoDaddy, etc. and host them all through Moniker. 
Any suggestions?"} {"_id": "21931", "title": "Do hosting company and CMS matter?", "text": "Does it matter for SERPs where my site is hosted and whether it was created using a CMS? And if yes, does it matter which particular CMS was used?"} {"_id": "21280", "title": "SEO & Multilingual: would this be a good practice?", "text": "I am currently making a bilingual website and I'd like to get nice SEO results, of course. **Here's my idea:** The internal links would be composed with the \"www\" subdomain so that people can share links regardless of their language. Anyway, their language is determined by the `HTTP_ACCEPT_LANGUAGE` PHP variable. So, they would see http://www.site.com/mydocument/123 in their address bar and never see any links like \"http://fr.site.com/mydocument/123\" or \"http://en.site.com/mydocument/123\" The user can always switch the page's language thanks to links in the footer. The language-switching link would be: http://fr.site.com/mydocument/123 , and clicking on it would change his language session and redirect the user to http://www.site.com/mydocument/123 **In case of a crawling bot:** I read that if the HTTP_ACCEPT_LANGUAGE variable is missing then it's a crawling bot. So, in that case, we set the default language to English. Each page, as I mentioned earlier, has a link for another language: On the page **http://www.site.com/document/1323** , the link **http://fr.site.com/document/1323** can be seen by the bot and be crawled. * What do you think about this practice? 
* Would I get good SEO results for each language?"} {"_id": "42480", "title": "Why is AliasMatch not working?", "text": "Here is what we have: Alias /assets/ \"/home/virtual/public_assets/\" AliasMatch ^/~([a-zA-Z0-9]+)/assets/(.*)$ /home/virtual/public_assets/$2 AllowOverride All Here is the URI we are trying to match: /~admin30/assets/js/tests.js The `Alias` directive works for our live sites; however, the `AliasMatch`, which is supposed to match user directories, does not."} {"_id": "42487", "title": "mod_rewrite doesn't work after I added multiple domains", "text": "I have a Red Hat Linux server and I was using /var/www/html to serve my site. With the following directives, mod_rewrite was working fine. # Enable mod_rewrite engine RewriteEngine on # WITH mandatory 'www.' #RewriteCond %{HTTP_HOST} ^$uri\\.$tld$ [NC] #RewriteRule ^(.*)$ http://www.$domain$1 [L,R=301] # WITHOUT 'www.' RewriteCond %{HTTP_HOST} ^www\\.$uri\\.$tld$ [NC] RewriteRule ^(.*)$ http://$domain/$1 [L,R=301] After the addition of NameVirtualHost *:80 ServerName www.domainmaster.com ServerAlias domainmaster.com *.domainmaster.com DocumentRoot /var/www/domainmaster ServerName www.domainother.com ServerAlias domainother.com *.domainother.com DocumentRoot /var/www/domainother the URL rewriting doesn't work. The website files weren't modified. The .htaccess file redirects all traffic to index.php, and index.php decides which page to display. Any ideas?"} {"_id": "21935", "title": "Can AJAX in a CMS slow down your server?", "text": "I am currently developing some plugins for WordPress, and I was wondering which route to take. Let's take an example: you want to display the last 3 tweets on your page. ## Option 1 You do things the normal way inside WordPress. Someone visits the website; while generating the page, you fetch the tweets in PHP via the Twitter API and just display them where you want. Now the small problem with this is that you have to wait for the response from Twitter. This takes a few ms.
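Roughly, option 1 would look like this minimal sketch (the endpoint URL and JSON shape are invented for illustration; a real plugin would go through the Twitter API):

```php
<?php
// Option 1 (sketch): fetch the tweets server-side while the page is generated.
// The endpoint URL and JSON shape are invented for illustration.

function fetch_latest_tweets($endpoint, $count = 3) {
    // This call blocks page generation until the remote server responds.
    $json = @file_get_contents($endpoint);
    if ($json === false) {
        return array();
    }
    $tweets = json_decode($json, true);
    return is_array($tweets) ? array_slice($tweets, 0, $count) : array();
}

function render_tweets(array $tweets) {
    $html = '<ul>';
    foreach ($tweets as $tweet) {
        $html .= '<li>' . htmlspecialchars($tweet['text']) . '</li>';
    }
    return $html . '</ul>';
}

// In the theme template:
// echo render_tweets(fetch_latest_tweets('https://api.example.com/tweets.json'));
```

The few-ms wait I mean is that `file_get_contents()` call blocking while the page is built.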
No real problem, but this question is just out of curiosity. ## Option 2 Here you don't do anything in WordPress on the initial load, but you do have the API inside. Now you just generate the page, and as soon as the page is done on the client side, you make a small AJAX call back to the server **via a WordPress plugin** to fetch your latest tweets. In other words, asynchronously. Now the problem with this, IMO, is that you have much more stress on your server. For starters, you have two HTTP requests instead of one. Secondly, the WordPress core has to load twice instead of once. ## Other options Now I know there are a lot of other options: **1)** Getting the tweets directly via JavaScript, no stress on the server at all. **2)** Caching the tweets so they are fetched from the DB instead of using the API every time. **3)** Getting the tweets from an AJAX call that is not a WordPress plugin. **4)** Many more. # My Question Now my question is: if you only compare 1 and 2, which would be the better choice? **NOTE:** I am only interested in comparing 1 and 2. No other options. The plugin I am creating has nothing to do with tweets; it's just an example to illustrate the problem."} {"_id": "46655", "title": "What is g1sense.int.godaddy.com?", "text": "I recently looked at my list of referrers (i.e., the previous page the user was on), and one odd one is a link like `http://g1sense.int.godaddy.com:15871/cgi-bin/blockOptions.cgi?ws-session=3372311093`. The domain name just redirects back to `godaddy.com`, though, and going to the link doesn't seem to actually do anything. The odd thing is, I'm not the only one seeing this mystery referrer, according to Google. Because it seems so widespread, can anyone tell me what this is? Is my site being blocked by a user or in some other way?"} {"_id": "42337", "title": "How to ensure only one user per paid account on a website, given a set of email ID and password?", "text": "For a website, there is a free and a paid membership option.
Once the user pays, he gets a one-month membership with additional products that are not in the free account on the website. Now, once the user pays and gets a membership, how do I make sure that only one user accesses the account? I just want only one user to use a paid account; even if he gives the email ID and password to his friends, they shouldn't be able to log in. So what do I need to do for this? There are many sites that offer paid membership options. Do they do anything special to restrict use to one user per paid account?"} {"_id": "46659", "title": "What is the syntax for putting a nofollow attribute into a JavaScript file?", "text": "Matt Cutts states in this video: > It turns out that as we are executing JavaScript, we do look at the > attributes. So you can actually use Javascript and put, like, a nofollow > attribute on individual URLs. It's not clear to me what the syntax for that would be. How would I put a nofollow on a link in a variable like this? var mylink = \"http://example.com/\"; or if it's in AJAX like this? $.ajax({ type: 'post', url: '/path/ajax-handler/' + method, success: function(data) ..."} {"_id": "42332", "title": "Find visitors to multiple subdomains on a single visit with Google Analytics", "text": "I'm working on a site that has quite the backlog of Google Analytics data for their site network. One of our big questions is whether people enter on one site and move to another (and if so, of course, how these visits differ from single-site visits). The hostname report (Audience > Network > Hostname) shows all the host names, and I've set up Advanced Segments to get site-specific data. That all works great, but I'm really having a hard time figuring out how to find visits to multiple sites, defined as visiting more than one subdomain, or the root site and one or more subdomains. I do see that other hostnames somehow come through when I apply one of the segments to the hostname report.
This I can't say I expected. Is that the best way to see if people are visiting 2+ sites?"} {"_id": "55954", "title": "Black background", "text": "The background of my site is black. On every reload, the browser window **_blinks_** white, because the browser's default background is white (Chrome & Firefox). Is there any way out, other than rebuilding the whole browser from source?"} {"_id": "32445", "title": "Web pages with mixed-ownership photos", "text": "I have a photo website. 15% of the photos belong to approved registered users. They agree to my terms about uploading their images to my web pages. I include a photographer credit in the bottom right corner. As for identifying the site with Google, every page contains a Google+ button to MY Google+ page. I need some advice in order to respect Google's rules about my pages containing other photographers' images, so they are not penalized for content that could be seen as duplicated or stolen. My concern is also whether adding G+ links (to MY photo page) and a Google publisher ID would harm my site's rank because of pages containing third-party photos."} {"_id": "20089", "title": "How is web hosting CPU usage measured?", "text": "Among virtual hosting providers, there is usually an amount of storage and bandwidth specified in the service description (example: 50 GB storage, 1 TB bandwidth/month) but no such thing for CPU usage. However, a clause in the agreement usually says something about using resources excessively. * How is this measured? * What are the measurement units? * What is being measured? * Is this a CPU usage percentage? * Does the peak count or the average (over a period)? * Excessive relative to what?
* Since no measure is specified in the agreement, what can be done to avoid being squeezed into more expensive plans?"} {"_id": "20087", "title": "Are Dynamics CRM 2011 web services accessible from outside?", "text": "I am working on a CRM 2011 implementation and we need to give access to CRM web service methods (for instance, Update Account). This is because our CRM needs to be fed by _n_ third-party websites. Can this be accomplished? Can the original, native CRM web services be opened to the internet? Note: As an alternative, we are considering wrapping the original web service within a custom one, so we expose our own web service, let's say \"CreateAccount\", and from it call the local CRM web service. But this is something we would **really love to avoid.**"} {"_id": "48555", "title": "What is the easiest reliable way to host an HTTPS-enabled static page on a domain I own?", "text": "I'm looking for a simple and reliable way to host an HTTPS-enabled static page on a domain I own. The first thing that came to mind was S3 static website hosting, but there seems to be no way to enable HTTPS for that. What other alternatives are there?"} {"_id": "48226", "title": "How do Mailchimp (and others) work out if an email was opened, went to junk, etc.?", "text": "I'm aware of read receipts in emails, but these mean that the reader has to agree to send back the read receipt. But Mailchimp (and perhaps others) have a dashboard to show you which people in your list opened your email, and also whether that email landed in a person's junk box. How is this done, as it doesn't seem to rely on read receipts?
I could imagine a system where you have a tracking pixel in the email, and you check on the server whether the email client requested the external tracking pixel; but quite a lot of email clients don't load images by default, so I wasn't sure how they could do it."} {"_id": "20083", "title": "Should I store addresses in encrypted form?", "text": "What are the general rules relating to address fields and the others? I only store the password in encrypted form; the rest are not. Email is also used during login. Can someone give me a suggestion related to * email * address * city * zip codes and others you may think of. I am the only admin at this point and no one else is viewing any record. What should be standard? As a side question, should I store the address no matter what (or make it optional), since the website is about buying stuff?"} {"_id": "20082", "title": "Swift Mailer SMTP mailer for PHP alternative", "text": "I'm trying to find a replacement for Swift Mailer, as my website is on Dreamhost. They do not allow `fsockopen()`, which apparently Swift Mailer uses. Does anyone know of a good alternative, or can anyone point me in the direction of something to use or do?"} {"_id": "48223", "title": "Malware/viruses on server", "text": "I was contacted yesterday for a potential freelance job. The guy told me he has a WordPress site and that his host told him it was hacked and there are code-injected files in the install. The host has temporarily blocked all front-end access to the site until these files are deleted or fixed. My question is in regard to the viruses/malware or corrupted files and opening an FTP or SSH connection to this hosting account. What risks are posed when opening a connection to this hosting account? Can these files potentially transfer any malware to my desktop and then again to other sites I open connections to?
I am not familiar with this type of situation and would appreciate any insight into it."} {"_id": "45302", "title": "Are web applications with SSO a portal?", "text": "I'm a bit baffled regarding the use of the \"portal\" keyword. My understanding is that a portal is a general web page containing several portlets, each of which is a rendering of data fetched from a different source. Example: iGoogle **Reason for my bafflement (it's a word, I checked):** We built a set of web applications/services using IBM Websphere Portal for our client (a university). Each of the university services has its own separate set of web pages (no portlets). The apps/services are accessed from the university's website using SSO (Single Sign-On). Our client uses the word \"portal\" to refer to them, although the concept of \"portlets\" doesn't exist. So, is it a portal?"} {"_id": "5704", "title": "SEO for pages that load from database", "text": "I have a very stupid question. I developed a website that has only one page in the public HTML directory (index.php). All content is dynamically loaded as different pages. How can I SEO my pages for Google? I mean, like what we type on this site is, I believe, stored in a database, and search engines find these pages like any static pages... Just a little tip would be appreciated!"} {"_id": "65776", "title": "How to do SEO for a database-driven website? ASP.NET", "text": "Hi to everyone, and thanks in advance for your time reading this question. I'm kinda new to this SEO stuff and there is something I can't figure out despite all the reading I did. I've developed a new website which allows users to add recommendations for every kind of business, place, professional, etc., so other people can see, for example, if traveling to X place is worth it.
From what I've read so far (and from what I've understood, which are actually two different things, haha) it's important to make user- and search-engine-friendly URLs and have proper keywords, title and h1 tags. (I've read this post SEO for pages that load from database and a few more.) That's not a problem, because I can do that in server-side code, but what I actually don't understand is how Google shows pages that collect info from databases in its search results. Let's use an example just to detail this a little bit more. If we go to Google and search for \"abercruz paginas amarillas\", Google shows us a link that sends us to a page from \"paginasamarillas.com\" which has all the details about that store. Instead, if we go to Google and type \"abercruz yousug\" (which is my page), nothing shows up. I know that it could take Google a few days to index my page, but I don't understand HOW Google could \"search\" my database (or page, or whatever) to end up with the correct URL showing the details of that commerce. If someone could explain that to me, or help me a little bit in understanding how to rank content that is generated from a database, I would be very grateful. Thanks again!"} {"_id": "25291", "title": "What is the Easiest Way to Change the URL Structure of a Dynamic PHP Website?", "text": "I am an SEO guy and I always face this problem: PHP sites make dynamic URLs with IDs. This is not an SEO-friendly structure\u2014like `/article.php?id=2987`\u2014and it needs to be changed to something like `/why-to-hire-a-seo-specialist`. Is there any easy way to change this type of URL structure?"} {"_id": "52019", "title": "How can Google crawlers see PHP dynamic content?", "text": "I'm reading the Google SEO starter guide, and on page 11 it says you should prepare one site for users and one for search engines in XML. My site is about vehicles, and each vehicle is not in a separate folder. It gets loaded dynamically with GET and PHP \u2013 so can Google crawl this if it's \"not really there\"?
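For context, my loading code is roughly like this simplified sketch (the vehicle parameter name and the data array are stand-ins for the real database lookup):

```php
<?php
// Sketch: one PHP script serves every vehicle page, keyed by a GET parameter.
// The array stands in for the real database; 'vehicle' is an invented parameter name.

function render_vehicle($key, array $vehicles) {
    if (!isset($vehicles[$key])) {
        return null; // caller should respond with a 404
    }
    return '<h1>' . htmlspecialchars($key) . '</h1><p>' . $vehicles[$key] . '</p>';
}

$vehicles = array(
    '2009-camaro'  => 'Specs and photos for the 2009 Camaro...',
    '2010-mustang' => 'Specs and photos for the 2010 Mustang...',
);

// In seevehicles.php: each distinct query string is a distinct URL a crawler can request.
// $page = render_vehicle(isset($_GET['vehicle']) ? $_GET['vehicle'] : '', $vehicles);
// if ($page === null) { http_response_code(404); echo 'Not found'; } else { echo $page; }
```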
The pages are there under `?GET` variables like `home.com/seevehicles?2009-camar`, so am I doing it wrong, or can Google crawl this too?"} {"_id": "22804", "title": "SEO purpose: Does Google prefer HTML sites over WordPress sites?", "text": "> **Possible Duplicate:** > SEO for pages that load from database Will HTML sites have any advantage over blog sites when being analyzed by Googlebot?"} {"_id": "57616", "title": "How should I create URLs for database-generated content that can be indexed by search engines?", "text": "I have learned how to retrieve data from a database to a single page by using a form, so I don't need to create many pages to show the elements of a catalog. The concern I have is how to make this database content available to search engines. I have read that I need my site to generate a different URL for each search result, but in my test, the webpage always has the same URL without any additional parameter. I know that sites like Facebook and similar generate particular URLs for each profile and they are indexed, but I don't know if \"physical\" files exist for them or how this is done. I have read questions similar to this, but still can't figure out how to generate the URLs."} {"_id": "35117", "title": "Does Parallels Plesk Panel 11 have a free built-in firewall?", "text": "I am new to Linux dedicated server hosting and Plesk Panel 11, and I am looking for a **built-in firewall module**. Does it come free with Plesk 11, do I need to pay, or does Plesk 11 not support a firewall? I looked at the demo http://www.parallels.com/products/plesk/demos/, but I couldn't find any information about a firewall in Plesk 11."} {"_id": "57087", "title": "Rebranding: how do I redirect my website to a new domain, and my blog to a subdomain?", "text": "I have a website that I am rebranding and using a new domain name for. Right now it is primarily a blog hosted at the root.
I am going to be redirecting from `www.old.com` to `www.new.com`, and I also want to move my blog to `blog.new.com`. Currently `www.old.com` doesn't serve any subdomains. Based on the research I've done, it would seem that I need to use 301 redirects, because these should be permanent. I just don't really know what I need to do to set up these redirects. Should I use `mod_rewrite`? My sites are hosted on Dreamhost, so any Apache configuration I need would, I believe, have to go in an `.htaccess` file."} {"_id": "10393", "title": "Problem with DNS", "text": "I bought a new website, and the company gave me another free domain name; when I asked for the second, they created it and told me to change the DNS to match the first one. It's been a week waiting for it to propagate; today when I type the URL I get this error message: If you are the web site owner, it is possible you have reached this page because: * The IP address has changed. * There has been a server misconfiguration. * The site may have been moved to a different server. If you are the owner of this website and were not expecting to see this page, please contact your hosting provider. When I try to add the second domain to my cPanel (addon domain) I also get another error: `The addon domain \u201cabcdef.com\u201d has been created.` `An account with that login already exists.` Do you have any ideas about this problem? Thanks. `EDIT` I tried to flush the DNS with `ipconfig /flushdns`, but it's not changing anything."} {"_id": "10395", "title": "How to effectively use an overseas SEO team?", "text": "My company is currently in contract with a 20+ person team in the Philippines, previously used for comment linking and guest-blogging spun content articles. This is a practice that we're stopping, but we don't want to sever ties with the team, because they work hard, they're really cheap, and they produce excellent accounting and reporting of their actions.
What are ways that we can best put them to use as a link-generating or content-generating resource? Their English is fair, but not of high enough quality to use them for any direct content creation. Thanks"} {"_id": "10397", "title": "What is hosting space and why is it increasing every day?", "text": "My hosting's disk space usage is increasing every day and I just wanted to know why. I don't upload new files, but it still increases every day. Does anyone know why? Thank you."} {"_id": "59642", "title": "Content keywords are unimportant in Google Webmaster Tools", "text": "When I check **_Google Index > Content Keywords_** in Google Webmaster Tools, it shows lots of unimportant keywords like: to, is, are, from, in, on, we, you, etc. It's very natural that these words are found more often than others, because it's so hard to find a sentence without one of these popular words. I'm wondering why Google does this. Is it a problem for SEO? How should I deal with it?"} {"_id": "58475", "title": "Is it advantageous for SEO to include structured data in meta tags?", "text": "On my site's home page, I have various products for which I display an image linking to that item's main product page. For each of these products, I could include a schema.org/Product tag, but the `itemprop=\"brand\"` and `itemprop=\"name\"` values would not be visible to the customer, apart from being part of the img alt text, so I would include them as `<meta>` tags within the product div, next to the image. Therefore the meta information for the structured data markup would not be visible to the user. Is this disadvantageous for SEO, meaning I shouldn't mark up the homepage with structured data, or is it preferable to include the structured data with the non-visible information in meta tags?
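To illustrate, the kind of product tile I am considering looks like this (brand, name and paths are invented):

```html
<!-- One product tile on the home page; brand and name only appear in meta tags -->
<div itemscope itemtype='http://schema.org/Product'>
  <meta itemprop='brand' content='Acme'>
  <meta itemprop='name' content='Acme Widget'>
  <a itemprop='url' href='/products/acme-widget'>
    <img src='/img/acme-widget.jpg' alt='Acme Widget by Acme'>
  </a>
</div>
```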
The `itemprop=\"url\"` property for each product on the home page refers to the product's main page, which is why I thought including these tags would be beneficial for Google to see how the site is structured. I just don't want to appear spammy."} {"_id": "34385", "title": "Failure retrieving contents of directory", "text": "Currently I have a couple of websites. My problem is that if I log in to one specific domain with any of my programs (Notepad++, FileZilla and NetBeans), the program stops at the content listing. I had it running correctly (I've been working on a project on this domain for more than a year now) and suddenly I broke it somehow. This only happens on one specific domain; all other domains (from other hosts) are working. My colleague (next to me, with the same IP address) is able to log in to this domain. * Notepad++ says: Failure retrieving contents of directory * FileZilla says: Failed to retrieve directory listing * NetBeans pops up: Upload files on save failed. (Because I have the upload-on-save setting enabled.) What I tried: * First I thought it was my firewall; I disabled the firewall, but no result. Also note that all other domains are working. * Maybe a blacklist with my IP address? No, my colleague has the same IP address. Could anyone help me with this? ## Notepad++ Log [NppFTP] Everything initialized -> TYPE I Connecting -> Quit 220 ProFTPD 1.3.3e Server ready. -> USER username 331 Password required for domain -> PASS *HIDDEN* 230 User username logged in -> TYPE A 200 Type set to A -> MODE S 200 Mode set to S -> STRU F 200 Structure set to F -> CWD /domains/domain.nl/ 250 CWD command successful Connected -> CWD /domains/domain.nl/ 250 CWD command successful -> PASV 227 Entering Passive Mode (194,247,31,xx,137,xx). -> LIST -al Failure retrieving contents of directory /domains/domain.nl/ ## Filezilla log Status: Verbinden met 194.247.xx.xx:21... Status: Verbinding aangemaakt, welkomstbericht afwachten... Antwoord: 220 ProFTPD 1.3.3e Server ready.
Commando: USER username Antwoord: 331 Password required for username Commando: PASS ******** Antwoord: 230 User username logged in Commando: SYST Antwoord: 215 UNIX Type: L8 Commando: FEAT Antwoord: 211-Features: Antwoord: MDTM Antwoord: MFMT Antwoord: LANG en-US;ja-JP;zh-TW;it-IT;fr-FR;zh-CN;ru-RU;bg-BG;ko-KR Antwoord: TVFS Antwoord: UTF8 Antwoord: AUTH TLS Antwoord: MFF modify;UNIX.group;UNIX.mode; Antwoord: MLST modify*;perm*;size*;type*;unique*;UNIX.group*;UNIX.mode*;UNIX.owner*; Antwoord: PBSZ Antwoord: PROT Antwoord: REST STREAM Antwoord: SIZE Antwoord: 211 End Commando: OPTS UTF8 ON Antwoord: 200 UTF8 set to on Status: Verbonden Status: Mappenlijst ophalen... Commando: PWD Antwoord: 257 \"/\" is the current directory Commando: TYPE I Antwoord: 200 Type set to I Commando: PASV Antwoord: 227 Entering Passive Mode (194,247,31,xx,xxx,xx). Commando: MLSD Fout: Verbinding verloren Fout: Ontvangen van mappenlijst is mislukt Sorry that it's Dutch. ## Next edit I changed the transfer connection mode from passive to active and got it working. Does anyone know why my computer wants it to be active when I used passive for more than a year? So weird that only my computer needs this setting after using the other one for so long..."} {"_id": "7218", "title": "BBCode or WYSIWYG editor for non-technical people", "text": "I am creating a forum for non-technical people, and I can't figure out whether BBCode or a WYSIWYG editor is easier for them to use. The forum is for craftsmen, and I am therefore in doubt whether a BBCode editor would be too difficult to figure out. Do any of you have good suggestions on what to use?"} {"_id": "7219", "title": "Deploy repository to production server?", "text": "I have a hosted SVN repository with Assembla. It has an address like: > https://subversion.assembla.com/svn/my-repository-name/ Using TortoiseSVN I can check out from this repository and commit to it.
But **how would I deploy to a production server?** (I have not created a production server yet; I want to understand how this is going to work first.)"} {"_id": "31169", "title": "How to indicate a page is duplicate content when you control its body but not its head?", "text": "http://www.zcommunications.org/ready-or-not-can-bangladesh-cope-with-climate-change-by-hazel-healy is a copy of a page on our site created by the author, and links back to that page. Google's guidelines suggested to me that this would be enough for Google to recognise our page as the canonical one and that one as the duplicate, and thus to show our page in SERPs. However, the opposite has happened: if you search for the page's title, you'll see the duplicate page shows up but ours doesn't. How can we prevent this? Since the author 'owns' the duplicate page on zcommunications.org she can edit the HTML body, but not the `<head>`."} {"_id": "16931", "title": "What is a rough maximum amount of page views per month that an average shared hosting service should support, in your opinion?", "text": "Sorry for asking such an inexact question, but I would really love to know what professional webmasters consider to be the rough limits of standard shared hosting services (GoDaddy, MediaTemple, ThePlanet, et al). I realise mileage varies massively, but I spend a lot of time wondering when to move growing sites from shared hosting to something more robust, like a VPS. Would it be fair to say that once a site hits 50,000 page views per month, one should consider better hosting, or should a shared host be able to handle this?"} {"_id": "7210", "title": "Should I add rel nofollow to menu links?", "text": "Should I add rel nofollow to the links in my site menu? (Links like home, about us, contact, etc., that have no connection with the site's niche.) Does this help?
What other options do I have?"} {"_id": "44652", "title": "Is it valid to have 2, 3, or more canonical tags?", "text": "I have seen a major website with more than 2 canonical tags on certain pages. * * * Here's the page URL: http://www.example.com/en_GB/shop/details.cfm?R=PRODUCT23:en_GB Here's the first canonical tag, in the `<head>`: It adds the `/category/` directory and the page's design looks different (colors are branded toward that category) but has the same content. And lower down, still in the `<head>`, is the second canonical tag: Instead of the product page, this points to the general parent category of the US shop (instead of the UK one). * * * This only happens on a few of their hundreds of pages, which are typically product pages. Google recommends putting the canonical in the `<head>`, because Matt Cutts explains that it can be abused and injected in the `<body>` maliciously. I assumed it was a mistake/bug of the CMS, since by the definition itself, there can be only one canon[ical]. But is there a valid reason for doing so?"} {"_id": "44654", "title": "Should initials in names be separated by a dot or by a hyphen for SEO purposes?", "text": "I wanted to use names with initials in the URL. For example, if \"j.k. Brothers\" is the name with initials, is it good to make my URL like so: sitename.com/j.k.brothers/listing/ Or would it be better to use hyphens, like so: sitename.com/j-k-brothers/listing/"} {"_id": "17388", "title": "SEO effect of linking text out of context", "text": "I came across a curious situation recently. I was reading an article where some text was linked, but such that the link text was completely out of context. Here is an example: > as I work my way through the ``best insurance`` fraud crimes > making the news The linked text, \"best insurance\", is completely out of context, as \"best\" describes the \"insurance fraud crimes\". Is there any SEO effect to this?
Is there any indication that search engines attempt to analyse the context of links as closely as required to detect this?"} {"_id": "44656", "title": "hidden_html - What is this for?", "text": "My host provides pre-made folders (`softaculous`, `public_html`, `public_ftp`) and one of them is `hidden_html`. I'm curious as to what this folder is for. I realize it's just a folder on the server, and right now it's just housing the 2009 version of my site. Can this folder be used for anything special, or is it just there for looks?"} {"_id": "7217", "title": "Address bar showing long URL", "text": "I recently upgraded my hosting account to Deluxe, where I can host multiple websites. I added a domain name, created a folder in the root directory with the same name as my domain name, and uploaded my files. Now when I navigate the site the address bar shows: 'http://mywebsite.com/mywebsite/default.aspx' I want it to display: 'http://mywebsite.com/default.aspx' My thinking in creating folders that match the domain names was to keep them somewhat organized; I never intended to have my domain names listed twice in the address bar."} {"_id": "12227", "title": "Some browsers zoom in on website, removing whitespace on sides?", "text": "Some browsers show the website more expanded, removing the whitespace on the sides of the page. This has the effect of making everything seem more cluttered and bigger. Is there something in my code that makes them do this?"} {"_id": "53898", "title": "Are literary works published on my and various other websites considered duplicate?", "text": "I have published an ancient literary work which is not copyrighted or owned by any party. I can see that other websites and news magazine sites published the same long ago. The literary work consists of 1330 verses which cannot be modified, and which we don't want to modify. _It consists of 133 chapters with 10 verses per chapter, so websites normally categorize it as 133 pages. Each verse is exactly 7 words.
All websites will have these verses with optional extra information, so when compared with different websites which have published this, we can see an exact copy of 20 lines, with optional additional information below each verse or in some other form. As for the original source, it is from textbooks published long before the internet appeared._ Will that be considered duplicate content? If so, how do I emphasize that it is a literary work common to all (I mean, anybody can use it)?"} {"_id": "12225", "title": "Does the URL path have to match the actual categories we have set up?", "text": "Say, for example, we have an eshop with a category \"car spare parts\" that includes a category called \"tires\" which contains our product. We have created a menu on the left where the visitor clicks on \"car spare parts\", then clicks on the sub-category \"tires\", and then visits the tires product listing. Does the actual path of each product on the \"tires\" page have to be eshop.com/car_spare_parts/tires, or is it OK if the path is \"eshop.com/tires\"? Will the second solution cause any problems for the eshop's SEO?"} {"_id": "12221", "title": "Launch of new website deleted Google rank?", "text": "I think I might have slipped up here and would love some advice before I start making changes. We have the domains www.mysite.com and www.mysite.co.uk Currently we have the main site on www.mysite.com; this runs a host of subdomains and redirects for other sites, in ASP.NET. I develop in PHP and as such thought it would be easiest to launch a new server, host the new site on www.mysite.co.uk, and put a frame forward on the www.mysite.com domain. This would mean no other changes were needed, as I wasn't too sure about the DNS settings and certainly wouldn't want to play with the source code and server.
This frame forward provided by 123-reg seems to have destroyed all page rankings, though; I can't find www.mysite.com in any searches. Is there a way to get the www.mysite.com ranking back up to the top again? Thanks for your advice"} {"_id": "63050", "title": "How to handle URL encoded parameter separators when responding to AJAX crawled _escaped_fragment?", "text": "I've been reading here about how to handle the URLs Google requests. There is an example from Google: * Pretty URL: `www.example.com?myquery#!key1=value1&key2=value2` * Ugly URL: `www.example.com?myquery&_escaped_fragment_=key1=value1%26key2=value2` As you can see, in the ugly URL there are \"%26\" characters, which, as Google explains, need to be unescaped by the web server to obtain the original URL: %00..20 %23 %25..26 %2B %7F..FF As in the example, the \"%26\" (ugly URL) represents the \"&\" (pretty URL), so in this case we need to replace \"%26\" with \"&\" to get the original URL. How do we know which character is represented by the special characters that Google replaced?"} {"_id": "54483", "title": "Google Analytics Not tracking data correctly IP-address issue?", "text": "I have developed a small site for a client and the site has been placed inside a ` I also checked out addthis.com and it's a bunch of **JavaScript**. I mean: **is there a way to just use something simple?** Example: FB like, click here I know that this way it won't show \"how many people like it\" anymore, but I don't mind; I just want something simple, so I can use my own icon or text, etc. As I understand it, the Share button is possible by just adding: To share on FB click here.
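On the `_escaped_fragment_` question (_id 63050) above: the characters Google escapes are ordinary percent-encoding, so the server can recover the original fragment with any standard URL decoder rather than mapping each escape by hand. A minimal sketch in Python (variable names are illustrative):

```python
from urllib.parse import unquote

# Value of _escaped_fragment_ taken from the "ugly" URL:
#   www.example.com?myquery&_escaped_fragment_=key1=value1%26key2=value2
escaped_fragment = "key1=value1%26key2=value2"

# Percent-decoding turns %26 back into &, %2B back into +, and so on.
fragment = unquote(escaped_fragment)

# Rebuild the "pretty" URL the crawler is asking about.
pretty_url = "www.example.com?myquery#!" + fragment
print(pretty_url)  # www.example.com?myquery#!key1=value1&key2=value2
```

A generic percent-decoder covers all the byte ranges Google lists (%00..20, %23, %25..26, %2B, %7F..FF), so there is no need to special-case individual characters.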
So I suppose it should be possible for the Like button as well."} {"_id": "6819", "title": "is there a way to write .htaccess shorter?", "text": "RewriteRule ^([A-Za-z0-9_\-]+)$ /userviewproducts.php?category=$1 [L] RewriteRule ^([A-Za-z0-9_\-]+)/$ /userviewproducts.php?category=$1 [L] RewriteRule ^([A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+)$ /userviewproducts.php?category=$1 [L] RewriteRule ^([A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+)/$ /userviewproducts.php?category=$1 [L] RewriteRule ^([A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+)$ /viewbuyproduct.php?1=$1 [L] RewriteRule ^([A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+)/$ /viewbuyproduct.php?1=$1 [L] These are my rules. They work this way: if the URL is /a, /a/, /a/a or /a/a/, go to the file userviewproducts.php, and if the URL is /a/a/a or /a/a/a/, go to the file viewbuyproduct.php. It works as I need, but I see a code smell here and want to write it shorter. Will upvote every answer =)"} {"_id": "15709", "title": "Strange characters appearing on websites - ASCII? - UNICODE?", "text": "I have created many very simple pure HTML websites over the years. Most of them appear to work fine most of the time, but there is one recurring problem which I have never quite sorted out involving strange characters. The scenario goes like this: I create the site. I look at it in my browser and everything appears fine. I may look at it a great many times over the coming weeks or months as I make additions here and there, perhaps on a variety of browsers on a variety of PCs. Then one day I look at the page and see a random sprinkling of white question marks against dark diamond shapes. These might appear where I had expected to see hyphens or quotes or apostrophes. My immediate thought is that my browser got into some strange state because I was looking at some foreign website with strange characters, but I'm never quite sure.
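On the `.htaccess` question (_id 6819) above: the six rules differ only in the number of path segments and an optional trailing slash, so they can likely be collapsed to two by making the trailing slash optional and allowing one optional extra segment. An untested sketch, keeping the original character class:

```apache
# 1 or 2 segments, trailing slash optional -> category listing
RewriteRule ^([A-Za-z0-9_\-]+(?:/[A-Za-z0-9_\-]+)?)/?$ /userviewproducts.php?category=$1 [L]
# Exactly 3 segments, trailing slash optional -> product page
RewriteRule ^([A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+/[A-Za-z0-9_\-]+)/?$ /viewbuyproduct.php?1=$1 [L]
```

As in the original rules, the captured `$1` never includes the trailing slash, so the query strings passed to the PHP scripts stay the same.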
I'm left with that nagging feeling that perhaps half the planet is seeing my website with funny question marks all over it. So my question is: what's going on? What should I do to ensure that as many people as possible around the world can view my text as I originally intended? Should I be using those special HTML sequences like £ for all non-alphanumeric characters? Should I worry at all? **Edit:** Right now I have the problem occurring on this page: http://www.fullreservebanking.com/papers.htm ... part of it looks like this: ![enter image description here](http://i.stack.imgur.com/felr7.jpg) I am using Firefox 5 and the character encoding currently appears to be \"UNICODE (UTF-8)\". I do not remember manually setting the character encoding to anything since installation. I do occasionally look at Japanese websites for work-related reasons - though when I do so, I do not manually make any changes to Firefox settings. **Edit:** Now fixed. Web page altered accordingly."} {"_id": "39632", "title": "How to find data usage of a user on my website?", "text": "I have a website (project) where users get logged in, do their work and then log out. I need to build a report that displays how much data each person has used (bandwidth, how much was downloaded in KB, etc.). The process would count usage from user login to user logout. I have seen a little about Webalizer and AWStats for something like this, but I am not sure how they work. I have tried Content-Length, but some pages don't send Content-Length. I have also seen mod_bandwidth, but I am still a little confused. This process is needed for my site because our company is now thinking of charging per usage and also allocating bandwidth to each user (according to their membership). I haven't worked with these types of tools; I am a newbie in this matter. I have only done simple websites, not any setup like this in Apache or Linux.
My project is in CodeIgniter."} {"_id": "51845", "title": "Best way to use a keyword-rich domain to increase ranking of another website on that keyword", "text": "Consider the following scenario - we have an e-commerce enabled website selling (say) routers in Britain. Now, one of my keywords (of great importance) is \"British Routers\". My website is already on the second page but not moving ahead for this keyword. A little bit of research revealed that the domain britishrouters.com was available, so I bought it. Now I want to know the best possible way to get my original website to move ahead on this keyword using this domain. Though I am no SEO person, I guess 301 redirects would already be accounted for by Google. My best bet is probably to do a WordPress install and write a few blog posts with links to my original domain."} {"_id": "35652", "title": "What is duplicate content and how can I avoid being penalized for it on my site?", "text": "_This is a general, community wiki question regarding duplicate content._ _If your question was closed as a duplicate of this question and you feel that the information provided here does not provide a sufficient answer, please open a discussion on Pro Webmasters Meta._ * * * 1. What does Google consider to be duplicate content? 2. Will the way I am presenting my content result in a duplicate content penalty? 3. How can I avoid having my site's content treated as duplicate content?"} {"_id": "16988", "title": "product in multiple categories - duplicate content problem", "text": "> **Possible Duplicate:** > Content appearing under multiple categories; anything I can do to prevent > duplicate penalty? What is the best practice for displaying a product that may be in several categories without producing duplicate content as far as search engines are concerned?
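Back on the data-usage question (_id 39632) above: since Content-Length is not always present, a common approach is to measure the size of each response body the application itself sends and accumulate it per logged-in user. In CodeIgniter that would live in an output hook; the sketch below shows just the accounting logic in Python, with hypothetical names and an in-memory store (a real app would persist per-user totals in a database):

```python
from collections import defaultdict

# Hypothetical in-memory store keyed by user id.
usage_bytes = defaultdict(int)

def record_response(user_id, response_body):
    """Accumulate the bytes actually sent for one request."""
    usage_bytes[user_id] += len(response_body.encode("utf-8"))

def usage_report(user_id):
    """Return the user's usage in kilobytes, rounded to 2 decimals."""
    return round(usage_bytes[user_id] / 1024, 2)

# Example: two page views by the same user between login and logout.
record_response("alice", "<html>" + "x" * 1024 + "</html>")
record_response("alice", "<html>hello</html>")
```

Measuring the rendered output this way only counts HTML; images, PDFs and other static files served directly by Apache would still need a log-based tool (which is where AWStats or mod_bandwidth come in).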
In the past I have used a default category for each product so only one URL is ever displayed for each product, but this is not a nice user experience if, for example, a product is listed in maincategory A, subcategory B and also in maincategory B, subcategory D, and the user navigates through B > D to the product while the page URL is displayed as website/A/B/product - the user may have lost the subcategory they came from initially. Is it really necessary to remove the duplicate URL for search engine ranking purposes? Or will the search engine pick the most appropriate one to list?"} {"_id": "57752", "title": "Categories as duplicated", "text": "Would Google treat different categories as duplicates? On my website, one piece of content can have up to 10 different categories. That makes every piece of content appear in all those categories plus the main page."} {"_id": "35002", "title": "Can mass different log-in pages result in SEO duplicate and/or low quality punishments?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have internal pages that rely on an external API and which I would like to build upon user request. Two options I thought about: 1. Make lots of 'thin' pages that specify that if you want content about X, you need to log in, and then the page will be built. Pros: the user understands what they'll get when logging in. Cons: the SEO implications of such a solution, due to the mass of 'low quality' pages and 'cross-site duplicate content'. 2. Make them all redirect to ONE generic log-in page. Pros: no duplicate low-quality content. Cons: lots of internal links to the same log-in page. Which would you recommend?"} {"_id": "28629", "title": "Could this be considered duplicate content?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I'm building a dating site at the moment. On the homepage I have a paginated list of all members on the site.
Each page displays 10 members from newest to oldest. The pagination is simply \"<< Prev\" and \"Next >>\" links, so Googlebot should be able to crawl each and every member profile on the site. Now the thing is, these profile listings also appear on other pages. A listing block such as: ------------------------------------- Jill / 28 / Straight New York, NY Likes: long walks on the beach, vodka ------------------------------------- will appear on the `Homepage ---> www.mysite.com/page/2` as well as on this category page `\"Women seeking Men\" ---> www.mysite.com/women-seeking-men/page/3` So that's two places where the same listing block appears. Is this considered duplicate content?"} {"_id": "31821", "title": "SEO with duplicate content", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have a nature photography site with multiple types of photo galleries. Each photo and associated caption on my site appears in several galleries. For instance, a photo of a goldfinch that was taken on a trip to New Mexico in 2008 will appear in the \"goldfinch.php\" gallery, in the \"finches.php\" gallery, and in the \"New_Mexico_2008.php\" gallery. This duplication is useful for my site visitors - User A may want to see goldfinch photos, whereas User B wants to see photos from New Mexico - but I am concerned about the SEO implications. The typical suggestions for dealing with duplicate content, such as 301 redirects and canonical tags, probably won't work in this case, because the page content is substantially different (ranging from ~1% to ~90% duplication, depending on the specific example chosen). The obvious solution to me would be to edit robots.txt to only allow search engines to crawl one type of gallery - for instance, if they crawled only the galleries organized by species (e.g. goldfinch.php), all the photos on my site would be found exactly once.
However, the Google content guidelines recommend against blocking crawler access to duplicate information. Should I go ahead and use robots.txt anyway? Or is there a better solution?"} {"_id": "65295", "title": "SEO - Listing Sidebar Code Before Content Code", "text": "On my website, I have always ordered my code with the code for the sidebar coming first, then the navigation bar, then the content section (with the individual paragraphs, images, etc. specific to each page), and lastly the footer. By order, I simply mean that the code for my sidebar is first in the HTML document. The sidebar, navigation bar, and footer are all pretty much the same on all pages of the website. Is it bad that I am listing the duplicate content first? Should I be putting my content code first, because that is the part that is unique to each individual page? Does it matter what order code is listed in (especially, for example, if the navigation is absolutely positioned and not within the flow of the document)? **Edit** This is specific not only to the order of HTML code but also to the influence of duplicate content within the order of HTML. It may be that text that comes first in an HTML document is more heavily weighted, but does duplicate content apply to that as well?"} {"_id": "16187", "title": "I have domain.com and domain.org to the same site, should I use redirects to avoid duplicate content", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have both the .com and the .org for a domain name, and using Apache I point them to the same site content. I think this might be causing problems with the search engines because of duplicate content. I want the .org to be the essential website. How do others handle this situation? Should I be using 301 redirects to point all the .com requests to the .org?
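One common way to implement the 301 approach asked about just above (question _id 16187) is a host-based rewrite in the .com site's Apache config or `.htaccess`; an untested sketch with placeholder domain names:

```apache
RewriteEngine On
# Send every request on the .com host to the same path on the .org host, permanently
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.org/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, which is the signal search engines use to consolidate the .com URLs onto their .org counterparts.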
Should I just use the link rel=\"canonical\" on each page to point to the .org?"} {"_id": "48715", "title": "What is duplicate content and what does it affect negatively?", "text": "I am fascinated by **SEO** and its merits, so I am going to optimize my website. There are still a couple of questions that pop up... Well, I am aware that duplicate content is frowned upon, but I don't know what **duplicate content** is to `Google`... For example, I have a page which links to the top rated posts on the front page... If I click on another page, I do see the same link, but in a different context; not top rated, but just a regular post... Question: is this duplicate? I have two links that go to the same post, but is this seen as duplicate? On the other hand, I have a list of posts that I have ordered in a particular order. Now I have three links that link to the same post. Besides, I have made an excerpt of the concept with a \" **read more** \" link to the same post; I have 4 links to the same post... Is this duplicate content? If yes, what should I do? A `no-follow` on those links? On the other hand, I have **author pages** + **tag pages** in which the same posts pop up. Where do I put a no-follow and why? I would appreciate feedback and help."} {"_id": "31814", "title": "Does Google penalize pseudo-duplicate pages for different locations?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? My company's site's home page was not specifically optimized for any location. Now I am planning to optimize it for Boston, and to create ten or so other landing pages for other locations we serve. If we made these new pages by copying the original Boston one and changing the location's name (s/Boston/Montreal/), would Google consider them duplicate pages and penalize us?
What is the best practice for this?"} {"_id": "12269", "title": "How to tackle this duplicate content issue?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have a new website coming up with thousands of pages which will contain some city codes. I'd also like to place city info copied from Wikipedia (as per its Creative Commons license). Now, the focus of the pages will be the city codes; the Wikipedia content will be there only to improve page appearance. However, it could be hit with a duplicate content penalty, so I should use rel=canonical. But the main content/focus of the site is the city codes and not the Wikipedia info per se. rel=canonical would give the impression that the whole page is duplicated, lowering the page's SEO worth. **What should I do to tackle this duplicate content issue?**"} {"_id": "9690", "title": "SEO penalty for \"duplicate\" content when a site's also accessible via another domain name?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? While testing searches for keywords on my site, I noticed that a mirror of it at http://a8.8d.344a.static.theplanet.com/ sometimes appears as the top result rather than my primary domain. It looks like this is an alternative address for my server. Will the presence of identical content at this domain and at my primary domain result in a Google penalty? If so, what can I do about it? Thanks for any help..."} {"_id": "6269", "title": "Avoiding Duplicate Content Penalties on a Corporate/Franchise website", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? My question is really an extension of a previous question that was ported from stackoverflow and closed, so I cannot edit it.
The basic gist is that a regional franchise company has decided to force all the independent stores into one website look; they currently all have their own domains and completely different websites. After reading the helpful answers and looking over some links provided, I think my solution is to put a 301 on each franchise store site (acme-store1.com, acme-store2.com, etc.) back to the main corporate site (acme.com). All of the company history, product info, etc. (about 90% of the entire site) applies to all stores. However, each store should have some exclusive content such as staff, location pictures, exclusive events and promotions, etc. I originally thought that I would simply do something like acme.com/store1/staff, acme.com/store2/staff, etc. for the store-exclusive content, and then acme.com/our-company, for example, would cover all stores. However, I now see two issues that I don't know how to solve. 1. They want to see site stats based on what store site the visitor came from. If a user comes from acme-store1.com, is redirected to acme.com and hits several pages, don't I need to somehow keep that original site in the new URL to track each page in that user's session and show they originally came from acme-store1.com? 2. Each store is still independently owned and is essentially still in competition with the other stores, albeit in less competition than with other brands. This is important because each store would like THEIR contact info, links to their social media pages, their mailing list sign-up and customer requests on EVERY page. So if a user originally goes to acme-store1.com and is redirected to acme.com, it should still look to the user like it's all about store 1, even though 90% of the content will be exactly the same as it is on the store 2, store 3 and corporate sites.
For example, acme.com/our-company would have the same company history and the same header/footer/navigation, BUT depending on the original site the user came from, it would display contact info and links for THAT store. If someone came directly to the corporate site, it would display the corporate contact and links (they have their own as well). I was considering that all redirects would go to store1.acme.com, store2.acme.com, etc. (or acme.com/store1), and then I could dynamically add the contact info and appropriate links based on the subdomain or subfolder. But then I have to worry about duplicate content penalties because, again, about 90% of the text in these \"subdomains\" is the same. For reference, this is a PHP5 site. I've already written a compact framework utilizing templates and mod_rewrite that I've used for other sites. Is this an easy fix that I'm just not grasping? Any suggestions?"} {"_id": "8401", "title": "Duplicate content, point to original?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I am building a website for a client and would like to have some tabs on each product page containing the instructions, postage info, and ordering info. The problem is that most of these tabs are universal and will be the same across more than one product. Will the site get penalised for duplicate content, and is there a way to point to the original content in some way?"} {"_id": "38789", "title": "SEO problem for site with 2 domains", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have two domains pointing to the same site. I want both domains to co-exist; they share most of the same content, but they differ in design and are aimed at different markets / rival communities. Is there a way to let Google know that these two domains are the same site, so I don't get hit with a duplicate content penalty?
Any other general SEO tips for this situation would also be welcome. Thanks. Come on man, why was this closed? The linked page is completely irrelevant for me."} {"_id": "14284", "title": "Two URLs pointing to almost the same content", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? What I would like to do is the following: I have a website at `http://dev.upcoming-djs.com` which displays a music chart, and displays a music player on the right side. I would like to make a URL like `http://dev.upcoming-djs.com/track` which points to the same page, only with the track loaded in the player on the right side. If I understand correctly, it isn't good (SEO-wise) to point different URLs to the same page (with the same content). Is this true? And if so, is the penalty severe?"} {"_id": "28760", "title": "Does duplicate content inside your own website matter for SEO?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I'm not referring to the duplicate content where you copy content from another site and publish it on your own. I am talking about original content that you produced. If 10 pages on your site have more or less the same content, would this affect your ranking with Google?"} {"_id": "14775", "title": "Duplicate content - linking to other versions of same page", "text": "> **Possible Duplicate:** > Two URLs pointing to almost the same content If I have a page that has this URL: * domain.com/category/something/ but can also be accessed through these URLs: * domain.com/category/something/1 * domain.com/category/something/2 * domain.com/category/something/3 where the last number is used as an image index to display a certain image on the page. So the content of the pages is the same, just with one image changed.
How do I avoid Google indexing the versions with /number at the end, so the page only shows up once in searches - and is not considered duplicate content? Is it enough to put a rel=\"nofollow\" on links to the /number URLs?"} {"_id": "7935", "title": "Mobile website causing duplicate content issue on Google - how do we fix it?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? My website exists at `www..com` for desktop browsers and `m..com` for mobile browsers. We are now seeing that for some Google searches, our mobile site outranks our main website; it doesn't seem to matter if the search is done from a desktop PC or a mobile phone. Aside from that not being the desired behavior, we're concerned that Google is seeing our mobile site as duplicate content of our main site. At this point we're considering an outright block of Google's crawler on the mobile site (via robots.txt). Is that the best approach? Is there a way to make our mobile site show up for mobile search, our main site show up in desktop search, and avoid the present confusion? Thanks! -James"} {"_id": "13480", "title": "Duplicate content for sub domains i.e. example.com/subdomain/index.php seen the same as example.com/subdomain/", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have a website with some subdomain folders, and SEOmoz is picking them up as having both duplicate content and duplicate page titles. In actual fact, it's the same page, i.e. www.example.com/subdomain/index.php is seen the same as www.example.com/subdomain/ I want to point everything to the latter. I'm sure this can be done in the .htaccess file.
I sorted it for the main site (home page) and everything works fine, but when I tried copying the code and re-jigging it for the subdomains, either nothing happened or the page went down."} {"_id": "17154", "title": "Same page in different \"locations\" on same site - duplicate content?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? The same page, or rather the same content on a page, is accessible from two very similar URLs, namely: * http://www.example.com/path/to/page.php * http://www.example.com/path/to/page.php?catIndex=1 The URL param in this case affects all the navigation links on the page. Basically, the same page is accessible via two separate navigation routes. To the end user, who doesn't look at the URL, they could be perceived as two separate pages in different parts of the site, although the content is identical. I have a `rel=\"canonical\"` link element linking to the first URL (without the URL param). Could a search engine perceive this as duplicate content? I think my eyes have gone square, but I was considering adding a robots \"noindex\" meta tag to the page when the `catIndex` URL param exists. But it is really the same URL and I do want the content indexed once, so I'm now thinking this would be foolish?!"} {"_id": "21885", "title": "Duplicated content in Google. Same subpage different link", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? On my website I can access published posts by two different types of links. For example, http://mysite.com/posts/333 points at the same content as http://mysite.com/check/this-is-news,333 does. How will the Google search engine deal with it?
Is this an example of duplicated content or not?"} {"_id": "56485", "title": "How much duplicate content percentage is considered as bad for SEO?", "text": "If a site contains 94% unique content and the remaining 6% of the content is duplicate, can that be considered bad for SEO?"} {"_id": "58980", "title": "Adding same text in each Wordpress blog post - SEO relevant?", "text": "I am adding a text block below each blog post from my WordPress PHP code. Since it's always the same, I am wondering whether that's counted as duplicate content by, for example, Google? What do you think? Thanks for letting me know :)."} {"_id": "60867", "title": "What happens if I post articles from other websites?", "text": "On my website I'm posting 5 articles every day from different websites (sources). Will Google penalize me for this behavior, and if so, how should I handle it? I've been doing this for more than 2 years. Please suggest a solution."} {"_id": "12057", "title": "Redirecting domains to prevent duplicate content", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have 2 domains: www.company.com and www.product.com I like both domains, and they both point to the same site. I have all my canonical links set up properly, but do I have to pick one of the domains to be the master? Some of my visitors really like going to product.com/page.html and others like company.com/page.html But should they ALL 301-redirect to company.com/page.html?"} {"_id": "4660", "title": "Duplicate subdomains and SEO", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I have two subdomains for the same website. One has quite a high page rank, the other does not register. Is there anything that I need to think about in regards to SEO in this scenario? EDIT: Example: `blog.example.com` and `drupal.example.com` are two aliases for the same content.
The first has a very high page rank; the second has none."} {"_id": "19032", "title": "Can duplicate content on a .com and a .co.uk site impact google ranking?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? Can you please tell me if launching a .co.uk version of an English .com site can cause problems with ranking in Google SERPs? We have a .com eCommerce site with 100,000+ indexed pages. The site is many years old and has a page rank of 5, so it has a long history and lots of links. We launched a .co.uk version which uses the same content for about 50,000 products, but uses pounds sterling and UK pricing. Within two weeks, we went from getting 2,000 visitors a day to our .com site down to about 60 visitors per day. We have been struggling with this issue for the past 30 days and it's costing us a lot of money. We had not considered the possibility that a .co.uk site could cause this big a problem. The main .com website is still well indexed and crawled, but seems to have been filtered out of search results. The .co.uk site shows up for US searchers on Google.com and the .com site seems to be in the \"more results\" section. Could this be the problem we are dealing with? Could the .com site be suffering from a filter that shows the .co.uk site instead? If so, do you have any suggestions on what we can do to remedy the situation."} {"_id": "3078", "title": "SEO: Duplicate content caused by pagination/tag/index pages", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I read that I should use a `NoIndex` tag for transitional pages like index, pagination, or tag pages. Is this true? I have a Tumblr blog and am considering putting `NoIndex` on the index, search, tag, pagination, and date pages. Is `NoIndex` enough, or are there other methods? Should the index page of a site be marked as `NoFollow`?
That doesn't really sound too good. What are the pages you would put `NoIndex` on?"} {"_id": "35567", "title": "Universal navigation menu across domains - would it be considered duplicate content?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? Across different sites on different second-level domains there exists a universal navigation bar with a collection of roughly 30 links. This universal bar is exactly the same for every page on each domain. The bar's HTML, CSS and JavaScript are all stored in a subfolder for each domain, and the HTML is embedded upon serving the page, not injected on the client side. None of the links use any rel directives and are as vanilla as can be. My question is about Google's duplicate content rule. Would something like this be considered duplicate content? Matt Cutts's blog post about duplicate content mentions boilerplate repetition, but then he mentions lengthy legalese. Since the text in this universal bar is brief and uses common terms, I wonder if the same rule applies. If this is considered duplicate content, what would be a good way to correct the problem?"} {"_id": "34855", "title": "How to Keep SEO Score from Dropping with Duplicate Content", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? I'm hoping that someone has a solution for what I'm trying to accomplish. I'm working on a travel agency web site and there's an \"Overview\" section for each cruise line. These overviews are located on the index page for each cruise line. Here's my issue: The company is creating a search engine that includes details on each cruise line. Their write-ups on each cruise line are great, so I'd like to include the overview they created for each cruise line, rather than having to create all new ones.
However, I don't want duplicating their content to negatively affect the SEO scores of the pages they originally put this content on. It's going to duplicate, since each page that's dynamically generated by their search engine is going to include a section about the cruise line (where I'd want to place the overview). Question: Is there any way that I can include these overviews (ideally, copying the exact HTML that they've already implemented) without the search engines indexing those particular code sections? I'd want the rest of the search result pages to be indexed... just not the section of each page that contains this duplicate code. I saw something about using a span class named robots-nocontent in Yahoo (not sure if this also applies to Bing) and googleon / googleoff tags in Google. Is this the best solution? I'm open to any suggestions, thanks!"} {"_id": "25174", "title": "Are multiple results from a website search considered duplicate content?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? The website is a Magento store, but I believe this applies to any product search. We have a search form which takes several parameters. Each combination of parameters results in a different set of products being returned. Obviously, sometimes the same product will appear in different sets of products. Very little changes outside the product list when we filter the search - just the title tag and the filter lists; the one major change is the product list, where some products will disappear, but most of the time they will repeat across different filters. Also, since we have different URLs, the search engine is treating them as different pages. Is this considered duplicated content? Also, what is the best way to make my search SEO-friendly? Listing all my products and using the canonical tag to point every search there is an option, but is this the right choice?
Aren't I losing the possible keyword combinations from my filters (category, price range, brand)?"} {"_id": "19522", "title": "How to avoid duplicate content penalties for \"multi-homed\" content?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site? My website is for U.S. military members, helping them enjoy their base more by providing places to go on/off base, and information to learn about the base. It is capable of supporting every military base world-wide. Each base has its own landing page. From here users can see content like: place pages, \"blog\" posts, etc... Here's where my question comes in. Places can be viewed by bases within a given distance of that place, so some places will appear on multiple base pages. Blog posts are tiered. They can be global (appear on all bases), regional (appear in all U.S. bases, or overseas), sub-regional (appear in Japan, or Germany, etc...), or specifically for one base. **How do I avoid duplicate content penalties for places/posts that span more than one base?** For example: here is the same post on one base, and on another. The URL is different, but the content is the same, but it's \"legitimately duplicated\"."} {"_id": "53919", "title": "Static Banner Content Considered as Duplicate?", "text": "One query!! Suppose I have a site with 10 pages (in 3 different categories) and all pages share one static banner. Now let's say the 3 categories are A, B, C, defining 3 services, with some content on the banner representing each service. Will a duplicate content problem arise or not??"} {"_id": "26974", "title": "Part of content is available in another page. Does that qualify as duplicate?", "text": "> **Possible Duplicate:** > What is duplicate content and how can I avoid being penalized for it on my > site?
I know that duplicate content needs to be addressed in some way, but this problem has been bugging me for some time and I have no clue how to solve it. I have a page, say a product page, with URL **/product/1234/name-of-product**, part of which displays a stream of news related to this product, with a \"more\" button at the end, like a Facebook stream. When clicked, this more button is intercepted by JavaScript code which fetches the rest of the stream through an AJAX request to the URL **/stream/product?keyword=name-of-product**. However, if JS is disabled or for some reason the AJAX request cannot be made, the link works the normal way and goes to the URL above, which shows only the stream. Now I don't know how to tell search engines that this PART of the page is the same as the stream URL. The thing is, I don't want the stream URL to rank higher than the main product page. Is it right to put a _rel=canonical_ on the stream page that points to the product page? If so, the product page only shows a portion of the stream, and for more it links back to the stream page again. I'm a little confused here. Thanks in advance, and I'll provide further information if required to clear this up."} {"_id": "55596", "title": "Why Wikipedia does not have duplicate content issues with search engines?", "text": "I discovered that the following 2 URLs are serving exactly the same content. Why is this not impacting Wikipedia's SEO? I've heard having the same content across 2 URLs has a negative impact on SEO. http://en.wikipedia.org/wiki/Data_mining http://en.wikipedia.org/w/index.php?title=Data_mining Disclaimer: I'm a developer, not in SEO or marketing, so I don't know much about SEO."} {"_id": "15879", "title": "How to send hundreds of emails an hour with content personalized to each user?", "text": "There are more than 10 categories on my site, and users can register for more than one category. My PHP script prepares content for each category, according to user preferences.
My script merges those contents for each user, so every user can get personalized content from the categories they want. The problem is, I want to send 340+ mails per hour and DreamHost doesn't allow it. I am thinking of a service like MailChimp, but I couldn't find support for that scenario. Do they support personalized category & content? Can I use SMTP in DreamHost?"} {"_id": "15700", "title": "CMS for single user-editable pages?", "text": "Does anybody know of a CMS where users can edit their own page, and their page _only_ (something similar to about.me, except with more customization/options)? I'm not talking about profiles, but more like an individual web page for people's businesses. I want to be able to give local businesses the opportunity to make a single web page for their businesses with ease. I have looked at many CMS's, but I can't find anything that offers this type of functionality. I've checked out the following: * Unify * Concrete 5 * Drupal * Simple CMS * CMS Made Simple (and more) If anybody knows a CMS with the functionality that I'm looking for, or even a regular CMS with modules/plugins that I would be able to use, that would be awesome. Also: the cheaper, the better :D Thanks, Gerard"} {"_id": "6810", "title": "Domain Mapping with Wordpress.com", "text": "My girlfriend has set up a blog/website for herself on wordpress.com but also has a domain name she'd like to use when giving out her website URL, instead of a wordpress.com subdomain. Wordpress has some info about that here: http://en.support.wordpress.com/domain-mapping/map-existing-domain/ I'm interested in knowing more about the pros and cons of handing over a domain to Wordpress like this. Has anyone done it?"} {"_id": "6813", "title": "Good to optimize for broad but less-competitive keyword?", "text": "I have a Super Mario Bros website and business, Mario Planet, and I am looking for a target keyword for generating traffic - and therefore revenue - to the site.
Using the Google Keyword Suggestion Tool (quite a mouthful!), I found that the keyword `mario` generates: > Global Monthly Searches: 68,000,000 > Local Monthly Searches: 7,480,000 That's quite amazing. Now, to my intense surprise, the competition - as indicated by Google's little bar graphic - is very _low!_ I'm surprised about this, unsettled even. Now, I think I have to optimize for this keyword, just because, I would assume - out of that many people - that a percentage will be willing to purchase toys. And since the competition is so low, I doubt it would take long to optimize for it. Any suggestions? Thanks!"} {"_id": "15703", "title": "Do you need to crawl the whole internet to find backlinks of a URL?", "text": "Say I want to retrieve all the sites on the web that have a specific link on them. For example, I want to know all the backlinks made to my blog on other websites. There are services out there that do this: http://www.backlinkwatch.com/index.php - I was wondering how they achieve this functionality. Is crawling the entire internet the only option, or are there _third-party_ ways of doing this, say using Google?"} {"_id": "15873", "title": "Product application - is it a product or product variation", "text": "I'm dealing with a lot of vehicle-specific products, and I've been trying to determine whether to convert the variants/fit option into individual products. I currently put the vehicle-specific items under a product: Product: Widget Hood Deflectors Option1: 07-11 Silverado/Sierra, SKU1 Option2: 09-11 Ram, SKU2 etc. Take a hood/bug deflector for example. They all share the same description and specifications for the most part. They look very similar, but the shape/appearance could vary significantly depending on the vehicle it is going on. Another example could be a suspension lift kit. Each one is engineered for a specific vehicle application.
What would be the product: \"Widget Super Duper 4 inch lift kit\", or \"Widget Jeep 07-11 Super Duper 4 inch lift kit\"? If I converted the variants to a product, then I have _a lot_ more products (some so-called products or product lines have hundreds of applications) when no vehicle is selected, but if I require a vehicle to be selected, then the product results would be basically the same, and specific for that vehicle. The description would also be longer: Product: Widget Silverado/Sierra 07-11 Hood Deflector With the fit as a variant/option, I have fewer products, but I could have a huge list of options. Product: Widget Hood Deflectors Options: Fit/Vehicle Am I doing things right by having product applications as variants, or am I treating a product line as a product?"} {"_id": "6816", "title": "How to use .htaccess with exclusions", "text": "I want all requests on a domain to be redirected to https, with the exception of just one particular file. I'm not sure how to accomplish that with .htaccess -bash-3.2# cat .htaccess ErrorDocument 404 http://www.domain.com RewriteEngine On RewriteCond %{SERVER_PORT} 80 RewriteRule ^(.*)$ https://www.domain.com/$1 [R,L] -bash-3.2# The above code redirects everything perfectly; however, I need the robots.txt file to be accessible via http:// vs the https:// only"} {"_id": "9757", "title": ".htaccess rules to rewrite URLs to front end page?", "text": "I am adding a new application to my site at example.com/app. I want views at that URL to always open myapp.php. E.g. example.com/app -> example.com/app/myapp.php and example.com/app/ -> example.com/app/myapp.php **What's the correct form of rewrite rules in the .htaccess file?** I've got: RewriteEngine On RewriteBase /app/ RewriteRule ^myapp\\.php$ - [L] RewriteRule ^myapp.php$ - [L] RewriteRule . - [L] ...based on what the Wordpress front-end does. But all I see at example.com/app is a directory of files. :( (I put those rewrites at the top of my .htaccess file).
Any ideas? **Update** What actually worked: RewriteEngine On RewriteBase / RewriteCond %{REQUEST_URI} ^/app(/.*)?$ [NC] RewriteCond %{REQUEST_FILENAME} !-f RewriteRule . /app/myapp.php [L] This is good because: 1. Explicit or implicit calls to app/myapp.php work. 2. example.com/app redirects to app/myapp.php 3. example.com/app/ redirects to app/myapp.php 4. example.com/app/subfunction redirects to app/myapp.php 5. All other calls to example.com/otherstuff are untouched. Item 4 is Wordpress-like Front Controller pattern behaviour. I think that rule RewriteCond %{REQUEST_URI} ^/app.*$ [NC] needs refining, as it allows /app-oh-my-goodness etc. through too. Thanks for the answers."} {"_id": "29370", "title": "Allowing minors to play online browser game", "text": "What are some possible legal issues with allowing minors to play a text-based browser game? The issues I know of are laws against tracking minors (logging IPs at sign in) and laws against providing pornography (players uploading their own avatars). I'm making a text-based browser MMO. Players can message each other and see each others' avatars. I'll be logging IPs of players. I see some MMOs ban minors from playing, per TOS. I don't want to do that. And I don't want a TOS provision I don't plan on enforcing. I'm in the US. I'll take any advice or useful links you guys have got."} {"_id": "9754", "title": "Math to create web design grids?", "text": "I have my own grid layout, and I am creating multiple column designs for a website. Is there any formula I can follow to create different column designs with identical gutter space, i.e. margins? Most are alike (through trial and error); however, some are a pixel off, and I am looking to make it pixel perfect."} {"_id": "9753", "title": "Google search does not show sub-pages from my website", "text": "My website appears in Google search, but only the first page. Of course I have sub-pages linked from the first page, but the sub-pages do not show in Google search.
Not in Yahoo, not in Bing. What should I do? The sub-pages have not shown for three years. (I tried searching site:mydomain.com and clicked the 'repeat the search with the omitted results included' link.) What would you suspect the reason is? My website addresses were like xxx.php?yy=zzz etc., so I changed them to /yy/zzz using mod_rewrite. I thought it might be (X)HTML standard violations, so now I changed that. I hope Google will soon have my entire website, but I am a little bit pessimistic. Do you have any thoughts?"} {"_id": "29377", "title": "Google Adsense and Private Networks", "text": "As the title suggests, I'm trying to figure out if I can use Google Adsense on a private network. The network is accessible only by registered users, which made me wonder whether it would work or not. Thanks in advance!"} {"_id": "29375", "title": "Using another domain with Google App Engine", "text": "I'm trying to change my Google App Engine domain (domain.appspot.com) to the domain I bought from 1&1.com (mydomain.com). I went into the Google App Engine settings and added the domain. After making a Google Apps account, I was asked to verify my domain. The directions say that 1&1 doesn't allow me to create TXT records, so I can't use that method for verification. Their alternative is to upload an HTML file to my server, but I didn't buy hosting with my domain, I just bought the domain. My files are on domain.appspot.com. How can I make mydomain.com point to domain.appspot.com?
I've added ns1.googleghs.com as my nameserver in my 1&1 DNS settings, but I still can't verify my domain with Google Apps."} {"_id": "10576", "title": "What is the single most influential book about website promotion and internet marketing?", "text": "What is the single most influential, useful, famous book about website promotion and internet marketing?"} {"_id": "20119", "title": "I want to test major changes on my website without hurting SEO", "text": "I want to replace all my website files with new ones on the server and to test if they work alright, but I don't want to be crawled while this is happening, or if I am, I don't want the crawler to be able to see what's in the new pages. I just want to change it for like 15 min, but I tested other stuff once for 5 min and Google Webmaster Tools told me they had a lot of 404's during that period, and I don't want that to happen again. Besides that, I don't want to have to write my whole .htaccess file. I considered redirecting all the URLs to my main page and applying the changes, but I think this probably isn't a good practice. Any good ideas?"} {"_id": "30947", "title": "Warning: mysql_fetch_array() expects parameter 1 to be resource", "text": "I was trying to connect to my database through PHP but I keep getting the error: > Warning: mysql_fetch_array() expects parameter 1 to be resource I do not know what the problem is. This is the code I have: Can someone tell me what is wrong?"} {"_id": "10572", "title": "Hide email address with JavaScript", "text": "I read somewhere that hiding an email address behind JavaScript code could reduce spam bots harvesting the email address. This will not display the actual email address in the source code of the page, but it will display and work like a normal link for human users. Is there any point in doing this?
Will it reduce spam bots, or is it just nonsense that might slow down performance of the page because of the JavaScript?"} {"_id": "10573", "title": "Are browsers supposed to automatically retrieve intermediate SSL certificates and do they?", "text": "I have recently encountered the following issue: a user couldn't access a website that was using an intermediate VeriSign certificate because it is no longer included in Firefox. Some sources claim that Firefox will import the certificate from websites that send it, others say that it won't. Will it? If not, is it supposed to?"} {"_id": "244", "title": "Tools to check for common vulnerabilities?", "text": "Are there any good tools (desktop or online) which allow you to check whether your website has common vulnerabilities (e.g. SQL Injection, XSS)?"} {"_id": "25999", "title": "Website Testing for SQL and CSS injection", "text": "> **Possible Duplicate:** > Tools to check for common vulnerabilities? I would like to ask for some tools or code for testing my newly created website against **_SQL or CSS_** injection. The website is created in PHP and has a login module which is used for a ticketing system for the support module. The URL convention is **www.domainname.com/index.php** Thanks in advance."} {"_id": "18936", "title": "Encouraging recurrent users to +1 (or Facebook Like)", "text": "I want to display a message to users who have visited my site 5 times, but have not yet +1'ed or Like'd, to please do so. I guess JavaScript and a cookie are likely required, but I have no idea how to go about doing this. Any ideas?"} {"_id": "53577", "title": "How to create a redirect page in Google Sites?", "text": "How to create a redirect page in Google Sites? Example: Home \u2192 Page \u2192 _redirect_"} {"_id": "24142", "title": "Image hotlinking providers?", "text": "I use a lot of images in my WordPress blog, and due to hosting restrictions I need to host the images somewhere else and hotlink them in my blog posts.
So I am looking for a reliable image host which provides a free hotlinking service. Google Picasa would be best, but I think they do not allow hotlinking. PS. I'm not looking for hosts like tinypic or imgshack; I'm looking for websites which provide powerful features to organize images (e.g. albums, etc.)."} {"_id": "15380", "title": "Ideas to improve website loading speed?", "text": "Are there any ways in which I can improve the loading speed of a website? I know that Google is really pushing loading speeds, and with the popularity of mobile websites it's now imperative that sites load quickly."} {"_id": "41139", "title": "What is optimum time spent downloading a page?", "text": "> **Possible Duplicate:** > Ideas to improve website loading speed? I saw on wpengine.com, > Google says they lose 20% of their traffic for each additional 100 > milliseconds it takes a page to load. Speed matters. Google also > incorporates your page-load time into your site\u2019s search rankings. Faster > sites win, literally. That\u2019s why WP Engine custom-built our EverCache > technology to deliver WordPress fast enough for Google, and at scale. The average download time of my site is 1216 ms. I think this value is too high. How can I reduce this time? Currently I am using shared hosting. If I switch to a VPS server, will this value drop?"} {"_id": "56090", "title": "CloudFlare for Joomla website and how to make it more optimized", "text": "In our office somebody told me to use CloudFlare for better website browsing at the client end. I've put our company's website on CloudFlare; now, what other changes should I make so that my website is faster to access at the client end? The developer before me made our company's website in Joomla, and I'm not a good Joomla developer either. Is Joomla itself fast at producing pages?
And what else should I do to make my Joomla website faster?"} {"_id": "67353", "title": "Really slow WordPress website admin panel with a total query time 1664ms for the home page", "text": "I have a problem with my WordPress website being too slow. I installed a debug bar to check queries on the home page; it reports 202 queries and a total query time of 1665 ms. ![enter image description here](http://i.stack.imgur.com/47o3S.png) The CPU and the RAM are at critical usage levels. ![enter image description here](http://i.stack.imgur.com/SAev3.png) Here is what I tried: * The Optimize DB plugin to defrag the DB. * The ThemeCheck plugin to analyse and clean my template. * The W3 Total Cache plugin with object cache enabled. Necessary plugins are: * WordPress SEO by YOAST * Jetpack service by WordPress.com * Contact Form 7 Thank you"} {"_id": "21420", "title": "403 error on index file", "text": "When I try to access index.py in my server root through http://domain/, I get a 403 Forbidden error, but I can access it through http://domain/index.py. In my server logs it says \"Options ExecCGI is off in this directory: /var/www/index.py\". However, my httpd.conf entry for that directory is the same as the ones for other directories, and getting to index.py works fine. My permissions are set to 755 for index.py. I also tried making a PHP file and naming it index.php, and it works from both domain/ and domain/index.php. Here is my httpd.conf entry: Options Indexes Includes FollowSymLinks MultiViews AllowOverride All Order allow,deny Allow from all AddHandler cgi-script .cgi AddHandler cgi-script .pl AddHandler cgi-script .py Options +ExecCGI DirectoryIndex index.html index.php index.py "} {"_id": "24141", "title": "Why is there a form which allows us to redirect our naked domain, yet it does nothing?", "text": "In Google Apps Domain Management, there is a page for me to change how my naked domain `http://example.com` is redirected.
However, on that page, it tells me **You must change the A-record at your domain host for this change to take effect**. Now, if I have to change the A-records at my domain host manually, what do I still need to access this page for? If I change the A-records at my domain host manually, everything will work fine just as in a _usual non-Google domain setup process_, won't it? So what's the purpose of that form at all? ![enter image description here](http://i.stack.imgur.com/MzlXQ.gif)"} {"_id": "63273", "title": "How to copy or replicate a complex website to local files and modify them", "text": "I am not good at designing the visual side of a website. I found a website which I gave 10 out of 10 because its functionality suits my aims and it also seems very aesthetically pleasing. I know HTML, PHP, MySQL and some degree of CSS. I don't know JS, Ajax, jQuery. So I want to replicate this web site (save it completely) to my local machine and then modify it (content, colors, icons, etc.). I saved this web site in Chrome and IE. After clicking the site from my local folder, I saw an ugly & non-working site. My aim is to understand the functions of the parts that I don't know. For example, what will happen as a result of deleting a JS file from the page? Since the page is complex, it has lots of CSS and JS files to download, and I don't want to deal with them manually. Is there any alternative and easy way to get the web page completely to my local machine so that it also works like a charm locally? Regards"} {"_id": "58525", "title": "What is the effective number of keywords in a domain name for SEO?", "text": "I want to create domain names that target two keywords, but I have to add an extra word (or random string?) to get to domains that are still free. Will adding more keywords to a domain name reduce the SEO value of the other two?
How does Google divide value for keywords in the domain name?"} {"_id": "59122", "title": "Redirecting a Preview DNS subdomain to a domain with htaccess", "text": "I want to redirect all the WordPress URLs of the form `http://example.com.previewdns.com/my-favorite-post/` to `http://example.com/my-favorite-post/` I am trying to do this with the .htaccess code below, but it's not working: RewriteEngine On RewriteBase / RewriteCond %{HTTP_HOST} !^example.com.previewdns.com$ [NC] RewriteRule ^(.*)$ http://example.com/$1 [L,R=301] Please help"} {"_id": "15393", "title": "Is a URL with a query string better or worse for SEO then one without one?", "text": "I want to know, is there a huge difference in terms of SEO between these URLs: > mysite.com/ontario/toronto/listings > > or > > mysite.com/listings.php?p=ontario&c=toronto Will one URL rank higher than the other? Is there a huge difference here?"} {"_id": "38628", "title": "How important are SEO Friendly URLs", "text": "> **Possible Duplicate:** > Is a URL with a query string better or worse for SEO then one without one? Currently, my URLs look something like `http://mydomain.ext/question/5` where `question` is the Controller and `5` is the ID of the object or article retrieved. In theory, I could spend some development time and some server resources to have URLs that would contain more information about the page loaded. However, seeing how websites like YouTube and many others just keep simple URLs with just an ID, I am asking: does it matter? Is it worth it?"} {"_id": "28421", "title": "Are URLs with ?page=something bad for SEO?", "text": "> **Possible Duplicate:** > Is a URL with a query string better or worse for SEO then one without one? I have a listing of members on my homepage. My dynamic URLs are quite simple: http://mysite.com/members?page=8 I just use the query parameter to go from page to page. Is this bad for SEO?
Should I be doing something like this instead?: http://mysite.com/members/page/8"} {"_id": "35595", "title": "Parameters in an SEO URL", "text": "> **Possible Duplicate:** > Is a URL with a query string better or worse for SEO then one without one? This should be a very simple question for SEO experts. Let's say we have the following URLs: * http://www.test.com/some-sort-of-page * http://www.test.com/some-sort-of-page?pgid1189 * http://www.test.com/some-sort-of-page/page/1189 * http://www.test.com/page/1189/some-sort-of-page The first one is an ideal solution. What I need to do is somehow pass a resource identifier in the URL to know exactly what this URL is pointing to, since it can be pointing to a lot of different things. In the second URL, \"pgid\" specifies that the resource is a \"page\". URLs 3 and 4 specify the same thing differently. I do not care if the URL is friendly to people, because, let's face it - 99.9% of people will never ever bother to remember such a URL no matter how \"friendly\" it is. So the question is: which of the last 3 URLs would be the best solution for search engines? My guess is it would be the 2nd, with the query string, but I might be wrong. Thanks for your thoughts. P.S. Please don't suggest using the first URL. There's no problem using it, but the question is not about that."} {"_id": "65423", "title": "Is using URL parameters bad for SEO?", "text": "I recently noticed that my web page has problems with Google and good search results. Many of the existing SEO tools suggest that my website uses bad URLs for SEO. I'm using a single PHP file that handles all the sections via parameters. For example: * www.alanmarth.com/index.php (Main Page) * www.alanmarth.com/index.php?seccion=servicios (Services) * www.alanmarth.com/index.php?seccion=blog (Recent news) * www.alanmarth.com/index.php?seccion=blog&cat=2 (News category) * www.alanmarth.com/index.php?seccion=blog&id=3 (A single entry) Is this OK?
If it isn't, how can I solve it without having to rewrite my entire site?"} {"_id": "49389", "title": "What are the pros and cons of using friendly URLs compared to query parameters (and vice versa)?", "text": "I am noticing that more and more sites prefer using \"friendly URLs\" over adding information as query parameters. For example: domain.com/product/3343/ compared to domain.com?page=product&id=3343 Disregarding personal flavor, what are the core differences between the two approaches?"} {"_id": "67202", "title": "Best practice URL structure for pagination", "text": "Is one of these formats for pagination better for SEO? * www.example.com/list/1 * www.example.com/list?page=1 What considerations or factors should go into selecting one format or the other? I'm relatively new to this and don't want to make the wrong call."} {"_id": "63277", "title": "Apache mod_pagespeed ignores redirected pages", "text": "Using mod_pagespeed (https://developers.google.com/speed/pagespeed/module) with an Apache server, I noticed that some pages were not being processed. The pages in question all had one similar attribute - they were \"redirected\" pages - for example an ErrorDocument response or an \"index.html\" file serving as the response when a directory is requested. Is there any way to remedy this? I've checked the FAQ and had a good trawl through the PageSpeed documentation, but to no avail."} {"_id": "8848", "title": "Share links with", "text": "Because they are sending their `hostname` the data is filterable in custom reports but it still turns up in the _standard reporting_. Is there some global account setting to reject data from certain domains? I can't believe I'm the first GA user to have somebody accidentally spamming their dataset. I hope the answer is not to declare GA account bankruptcy and move on to a clean slate."} {"_id": "57418", "title": "How should I set up and use a CDN for my web application?", "text": "I plan to create and run my own CDN.
Currently, media and other static content are being served by the web servers behind the load balancer, which is a waste of resources. The network structure of my web application (with the CDN) will be as follows: * 1 load balancer node for the web servers (nginx) * 3 web server nodes (apache, php) * 2 database nodes (mysql) * 1 load balancer node for the CDN (nginx) * 3 CDN storage nodes (apache, php) How could I transfer files to the CDN from the nodes which generate media? For example, image processing after file uploads."} {"_id": "9509", "title": "Need some help on tomcat URL mod_rewrite or mod_jk", "text": "I am trying to remove the context name from the URL of my server. Current URL - `http://www.domainname.com/MyApp/` What I need is to make it available at - `www.domainname.com/` So it is only going to host one main app, and that needs to be displayed when we open `www.domainname.com/` in a browser. I have already tried a couple of things, like the ones below - RewriteEngine On RewriteCond %{REQUEST_URI} !^/(Context/.*)$ RewriteRule ^/(.*)$ /Context/$1 [P,L] OR redirect permanent /MyApp/ abcd://domainname.com OR Using JKMount - JkMount /MyApp/* ajp13 JkMount /MyApp* ajp13 OR Deploy war file to ROOT of tomcat and make relevant changes in web and server.xml None of these are working, and I keep getting an internal error. I need a way to basically trim the Tomcat URL to make it shorter."} {"_id": "63045", "title": "How to point subdomain of A to folder of domain B", "text": "I have a subdomain, say `sub.A.com` I would like this to be pointed to `B.com/folder` With `A.com` I am only able to change the DNS records since it's hosted on an eCommerce platform. With `B.com` I can change any parameter since I host it.
A simple `301` redirect from `sub.A.com` won't do because I don't want people to see `B.com` when they go to `sub.A.com`."} {"_id": "57412", "title": "Is it acceptable to use conditional comments to block IE rather than UA sniffing?", "text": "I just can't see a way to support IE8 and below. Having said that, I don't want to sniff the user agent because it's bad practice, and feature detection is a better alternative. Would it be acceptable to wrap the entire content of my page in a conditional comment for IE9 and above, thereby blocking IE legitimately? Or should I reconsider my plan?"} {"_id": "57413", "title": "How does Google Analytics know the sex, age, interests of visitors?", "text": "The demographics pane displays stats based on sex, age and interests. How did it learn this private information? ![screenshot](http://i.stack.imgur.com/j8g7O.png)"} {"_id": "9501", "title": "Apache mod_rewrite for multiple domains to SSL", "text": "I'm running a web service that will allow people to create their own \"instances\" of my application, running under their own domain. These people will create an A record to forward a subdomain of their main domain to my server. The problem is that my server runs everything under SSL. So in my configuration for port 80, I have the following: ServerName mydomain.com ServerAlias www.mydomain.com RewriteEngine On RewriteCond %{HTTPS} !=on RewriteRule /(.*) https://mydomain.com/$1 [R=301] This has worked well to forward all requests from the http: to the https: domain. But as I said, I now need to let any domain automatically forward to the secure version of itself. Is there a rewrite rule that will let me take the incoming domain and rewrite it to the https version of the same? So that the following matches would occur: http://some.otherdomain.com -> https://some.otherdomain.com http://evenanotherdomain.com -> https://evenanotherdomain.com Thanks for your help! Apache mod_rewrite makes my brain hurt.
Aaron."} {"_id": "9506", "title": "Virtual hosting", "text": "I want to use domains like xxx.abc.domain.tld. The xxx is my folder to access. I tried it with the rewrite rules, but I can't get it working, because I don't know how to get the xxx part from the SERVER_NAME into my RewriteRule. This was my attempt: UseCanonicalName Off # include the IP address in the logs so they may be split LogFormat \"%A %h %l %u %t \\\"%r\\\" %s %b\" vcommon CustomLog /var/log/apache2/vaccess.log vcommon RewriteEngine On # a ServerName derived from a Host: header may be any case at all RewriteMap lowercase int:tolower ## deal with normal documents first: # do the magic RewriteCond ${lowercase:%{SERVER_NAME}} ^.+\\.abc\\.domain\\.tld$ RewriteRule ^(.*)$ /var/www/abc.domain.tld/[xxx-part]/$1 [L] Perhaps there is a better solution. In general, I want to create a dynamic login system with mod_auth_mysql, where each xxx has a separate user database. I would prefer the domain/address syntax abc.domain.tld/xxx, but I don't know how to realize it. Thanks for any advice."} {"_id": "9504", "title": "RewriteRules targeting a directory result in a gratuitous redirect", "text": "I have a standard CMS-like RewriteRule set up in my .htaccess: RewriteRule ^(.+)$ index.php?slug=$1 Let's say I have a directory called \"foo\" in the root directory. For some reason, if you hit the page it causes a redirect: http://www.mysite.com/foo --> http://www.mysite.com/foo?slug=foo Removing the directory fixes the problem, but unfortunately, it's not an option. Does anyone know of a workaround?"} {"_id": "57415", "title": "Can I contact the owner of a domain name based on a WHOIS search?", "text": "I did a WHOIS search for a domain name that I want. I found some info about the registrant of the domain name, and while no useful direct contact info was given, there was enough there for me to do some searching on the web and find some other ways to contact the registrant. 
I would rather contact him on my own than pay $39 to make an offer, which is the option this website gives me, and I don't even know if it will be seen. But I also saw this on the page that shows the info about the registrant of the domain: > By submitting a WHOIS query, you agree that you will use this data only for > lawful purposes and that, under no circumstances will you use this data to: > (1) allow, enable, or otherwise support the transmission of mass > unsolicited, commercial advertising or solicitations via direct mail, > electronic mail, or by telephone; or (2) enable high volume, automated, > electronic processes that apply to [name of website] (or its systems). It seems like it just means that I can't do mass marketing using info obtained from these searches, but it wouldn't disallow me from contacting the registrant on my own. However, the phrase: `mass unsolicited, commercial advertising or solicitations via direct mail, electronic mail, or by telephone` makes me a bit uneasy...the \"or\" in there seems to imply that even solicitation that's not \"mass solicitation\" is disallowed."} {"_id": "49099", "title": "Which pages from the sitemap are not indexed by Google?", "text": "I have a sitemap file that has been submitted to Google, and has been processed, according to Google Webmaster Tools. As per GWT for 2013-05-31, 8255 web-pages are submitted, but only 8209 are indexed. 
Which 46 of the 8255 pages are not indexed?"} {"_id": "65522", "title": "apache is not allowing me to use port number 443 in virtual tag for configuring apache for https service", "text": "I am trying to configure Apache for HTTPS in XAMPP. I am almost done; I am just struggling with the VirtualHost tag in which I have given the information about my self-signed SSL certificate and key file. Apache is not allowing me to use port 443 in the VirtualHost tag even though no other service is running on this port; instead it works very well with any other free port like 55000. # Server Certificate: SSLCertificateFile \"C:/xampp/apache/bin/mycert.crt\" # Server Private Key SSLCertificateKeyFile \"C:/xampp/apache/bin/mykey1.key\" SSLEngine On ServerName localhost DocumentRoot \"C:/xampp/htdocs\" error : Error: Apache shutdown unexpectedly. 3:36:52 AM [Apache] This may be due to a blocked port, missing dependencies, 3:36:52 AM [Apache] improper privileges, a crash, or a shutdown by another method. 3:36:52 AM [Apache] Press the Logs button to view error logs and check 3:36:52 AM [Apache] the Windows Event Viewer for more clues 3:36:52 AM [Apache] If you need more help, copy and post this 3:36:52 AM [Apache] entire log window on the forums"} {"_id": "21638", "title": "How can you prevent someone destroying your app if they get into your account on heroku?", "text": "Heroku is wonderful, but it's slightly concerning that if your Heroku login is compromised someone can simply destroy your entire app and business. Is there any way of preventing this using multi-factor authentication or similar?"} {"_id": "32604", "title": "Visit without pageview in Google Analytics", "text": "Today I saw some metrics on one of my websites in Google Analytics and found something that is strange to me. How can I get visits without pageviews? 
I looked at Audience > Engagement > Page Depth. I have a page depth < 1 with 710 visits. Is there a problem with these numbers?"} {"_id": "53436", "title": "How to show the search box in Google Search results page?", "text": "I need to know the proper way to propagate the search box in the **Google search result page**. I tried searching Google for the procedure, but no luck. Any idea how to make the search box appear? ![enter image description here](http://i.stack.imgur.com/sVUdI.jpg)"} {"_id": "49090", "title": ".htaccess Directory Redirect Problem", "text": "I want to redirect a directory to another directory, but I've hit a snag. I want to redirect /directory_name to /new_directory_name so I put this in my .htaccess: RedirectMatch 301 /directory_name/(.*) /new_directory_name/$1 Which worked fine, BUT I also have another directory called: /images/directory_name/ And the redirect is redirecting links to files in that directory to /new_directory_name/ and resulting in broken image links. So I need a redirect that ONLY redirects /directory_name/ and not any other directories that happen to have the same name (but are in a different location)."} {"_id": "49093", "title": "Adding URL parameters to Google Webmaster Tools: URL encoded or not?", "text": "When adding parameters to Webmaster Tools for URLs that I don't want indexed, should I add the parameters with actual brackets, for example: technology[above_average] or should I add the URL-encoded characters which are showing in the SERP URLs, like so: technology%5Babove_average%5D Thoughts?"} {"_id": "21630", "title": "Whereto should I migrate my dasBlog site?", "text": "For a couple of years I've been hosting my blog on my own server. However, I'm getting tired of maintaining it myself, and I also wish to move away from the dasBlog engine, since development of that engine has stopped. For those reasons, I'd like to move the blog to a service that I don't host myself. It doesn't have to be free. 
However, I'd like to **move all the content** of the old blog so that **permalinks will still work** (redirecting to a new URL is also perfectly acceptable). The same goes for the RSS feed and comments. I was looking at wordpress.com, which looks good, but before I jump into that I wanted to ask whether there are other options I should consider?"} {"_id": "2010", "title": "Favicon, icon, shortcut icon, apple-touch-icon: Are there any others?", "text": "I know of 2 basic kinds of default web icons for websites: * **Favicon**: place an `.ico` formatted file at the root of your site named `favicon.ico` and most desktop browsers will find it and display it in various contexts (bookmarks, tabs). Can also be added in code, which allows you to specify other names and other graphics types for the file, though there are type and size limitations for browsers: * **Web Clip Icon**: place a `png` formatted file called either `apple-touch-icon.png` or `apple-touch-icon-precomposed.png` at the root of your site and iPhone OS/iOS versions of Safari will pick those up and store them when the site is added to the Home Screen. Can also be added with code like: **Question** : Are there any other default icon graphics for a website a webmaster can include?"} {"_id": "21634", "title": "Paypal PDT and IPN , how does it work?", "text": "## PDT Payment Data Transfer means getting the transaction data of the purchase that was made on the PayPal site, so you can fetch it on your own site and display it to the user. You may also want to store it in your database for archive and tracking purposes. But I cannot exactly follow the documentation here. What I am not getting is: > Once you have activated PDT, every time a buyer makes a website payment and > is redirected to your return URL, a transaction token will be passed along > as a \"GET\" variable to this return URL. 
In order to properly use PDT and > display transaction details to your customer, you should fetch the > transaction token, variable name \"tx\", and retrieve transaction details from > PayPal by constructing an HTTP POST to PayPal. > > Your POST should be sent to https://www.paypal.com/cgi-bin/webscr. You must > post the transaction token using the variable \"tx\" and the value of the > transaction token previously received (e.g. \"tx=transaction_token\"), and the > special identity token using the variable at and the value of your PDT > identity token (e.g. \"at=identity_token\"). You will also need to append a > variable named \"cmd\" with the value \"_notify-synch\", for example > \"cmd=_notify-synch\", to the POST string. ## IPN I have set up Instant Payment Notification in my settings according to this documentation. This is basically logging into your PayPal account and enabling IPN while specifying a URL where the notification will be sent. This is used to complete an order so that the product can be shipped. What I did is set up a PHP page. I have created a table, and whenever that page is called (or hit), it registers an entry in the table so I know a notification came from PayPal. But it does not work either. What am I really doing wrong? The first thing I want to troubleshoot, though, is that when the buyer pays the amount, he should be automatically redirected to my site. I have enabled this, but the automatic redirection just does not work. Instead he is shown the URL as an option after the payment confirmation is shown. Can someone guide me through how the PDT process goes? Where do I make the request for PDT: is it along with the very first request (the Buy Now button), or is it sent later? ## Addition I found some good sample code of how everything should work, but it still does not work. I use this code http://officetrio.com/modules/free-php-paypal-ipn-script.php for IPN. I am using this for PDT. This one uses SSL; I changed SSL to regular HTTP (copied the paypal version), still does not work. 
http://ykyuen.wordpress.com/2010/02/17/paypal-payment-data-transfer-sample-code/"} {"_id": "47352", "title": "Is using .se, Sweden's top level domain name, a bad idea for a website hosted in the United States", "text": "I have a client with a website name that ends with se. I thought instead of dealing with .com domain names I would register something like gardenho.se. Is there anything wrong with this? Does this impact SEO? Can I simply have the one name and then redirect it to the .com domain name so that it can be used on business cards?"} {"_id": "3688", "title": "Are there discussions about developing a user contributed website?", "text": "I'm developing a website where users will be the ones who add new input, validate new suggestions and correct incorrect information. (Much like this site). To make sure that abuse of these rights by users is minimized, I need to develop a good reputation system which encourages good behaviour, much like this site's reputation, again. I would like to learn more about this subject, which is why I ask: are there any discussions about the development of user contributed websites? Any resources, studies or anything else that could help me out to develop a system that will work? Thanks in advance. - Tom"} {"_id": "36338", "title": "Visitors have old website cached in their browsers", "text": "My client's new website is example.com, the old website is example.co.uk. I've re-pointed the A Records to the new website (so as to leave the emails alone) and put in 301 redirects from old pages to new pages. But, my client is upset as he (and he thinks many of his clients) have the old website cached in their browsers and won't know how to clear their browser cache. 
Is there anything I can do to overcome this? If not, after how long will browsers finally stop using their cached pages, so I can at least go back to my client and tell him that his clients will finally start to see the new website?"} {"_id": "28877", "title": "How to use Google Analytics as an affiliate to track sales data", "text": "As an affiliate, how can we get more information on sales? It looks like the goals feature in GA is for those who have control over the receipt page. But we are sending users away using an affiliate link. With event tracking, we've been able to count the clicks and see which links are being clicked the most, but not which ones actually convert. We want to find out the following on each sale: 1. Did the converted user come from search or internal traffic? 2. If it was search, which keyword brought the user to our site (and clicked away and converted)? Is it possible?"} {"_id": "3683", "title": "Link popularity check", "text": "Where do I search to check the link popularity of my site? On different sites I get different counts. Besides, what is the difference if I check link popularity using `http://article-stack.com/` or `article-stack.com`? In the 2nd case it shows higher counts."} {"_id": "28875", "title": "Redirect Permanent and https", "text": "I just set up HTTPS on my server, and I have an issue with redirect permanent. For example, `http://domain.com/index.html` redirects me to `http://www.domain.comindex.html` The **/** (trailing slash) is missing and I can't figure out how to fix it. 
It works with `http://www.domain.com/index.html` Here is my httpd.conf ServerName domain.com Redirect permanent / https://www.domain.com/ ServerName www.domain.com Redirect permanent / https://www.domain.com/ DocumentRoot /var/www/domain/ ServerName www.domain.com SSLEngine on SSLCertificateFile ssl.crt SSLCertificateKeyFile ssl.key "} {"_id": "43141", "title": "Backlinking to a website with image source", "text": "I just want to ask: can we increase backlinks to our site by putting a 1x1 pixel image on our clients' sites, or can anybody suggest a better solution? http://www.mysite.com/"} {"_id": "2519", "title": "Where should I look for a windows based CMS?", "text": "Ideal features: * windows * allows for localization * allows us to make it look exactly like the current http://fogcreek.com We're currently using ASP.NET, which means you have to check out the website to make any changes (and so our front page has stuff from a year ago on it). Lame. Can be free or paid. Any suggestions?"} {"_id": "43143", "title": "How to autopause Google Adwords if server is down?", "text": "Should I write my own monitoring script and use the Adwords API to pause the campaigns if my server is down, or is there a simple way to do it? I don't want to use 3rd party software, because the task looks rather simple for that."} {"_id": "43142", "title": "HTTP Headers caching", "text": "I am not totally sure about HTTP headers, but from what I read it's good to have some level of caching on static pages. Also, I am not sure if `Transfer Encoding: chunked` is a good thing. I have not found a definitive answer as to how best to run this caching with my PHP files so that caching is enabled and, when content changes, the cache is updated. 
Hopefully I can get some assistance with this. I was wondering which of the following would be best to use, or any other advice: header('Cache-control: public'); OR header('Cache-control: max-age=10'); Thanks for your time."} {"_id": "2514", "title": "How do you get new sites into your Quantcast network?", "text": "Over a month ago, we added some of the new Stack Exchange sites to the \"Stack Overflow Network\" on Quantcast, but to date, they haven't shown up: http://www.quantcast.com/p-c1rF4kxgLUzNc#subdomain Is this because of traffic numbers, or because we did something wrong?"} {"_id": "2515", "title": "Why would Google's ranking algorithm move a search result due to quotes?", "text": "Yesterday at 10AM, the Google search: > \"Submitting values of \"jQuery Editable Invoice\" into a MySQL DB using PHP\" came up with `stackoverflow.com` as the 8th result on the page. The first was a spammy page which was just copying the info from `stackoverflow.com` without attribution [contacted and they fixed it]. Removing the quotes surrounding the question > Submitting values of jQuery Editable Invoice into a MySQL DB using PHP `stackoverflow.com` came up first. Removing just the inner quotes > \"Submitting values of jQuery Editable Invoice into a MySQL DB using PHP\" `stackoverflow.com` came up first again. At 3PM yesterday, all results for those 3 combinations showed `stackoverflow.com` in the first result. What could have caused that?"} {"_id": "2516", "title": "Is there a jQuery lightbox plugin that has thumbnails inside the box?", "text": "I'm trying to find a lightbox style plugin in jQuery that displays thumbnails at the bottom of the picture (or video, or whatever). I found plenty of lightbox plugins, but none that allows navigation by thumbnail _inside_ the box. The plugin must be able to open images, swf and inline content. I realize that jQuery will not be able to generate the thumbnails and that's ok. Do you have any good ones to suggest? 
Thanks"} {"_id": "64927", "title": "Do search engines lower your ranking for using private registration like WhoisGuard?", "text": "I am registering a new domain name from \"namecheap.com\" which offers free WhoisGuard for a year. There are numerous reasons not to list your contact information with your domain registrar, such as blocking your contact information from telemarketers and email spammers. Google can only see registration information on domains that they hold. Does blocking WHOIS users from accessing your contact information via privacy services (WhoisGuard) have a harmful effect on SEO and lead to search engines not trusting you? WhoisGuard References: http://www.whoisguard.com https://www.namecheap.com/security/whoisguard.aspx"} {"_id": "64920", "title": "anonymous.google as a site name placement in Google AdWords?", "text": "We started a campaign on Google AdWord's content network. Using Value Track, they let you use `{placement}` in your ad URL so that the name of the site where your ad appears is passed as a URL parameter to your site. When we use this, we normally see the site name (like `example.com`) passed as a URL parameter. Some portion of the time, though, Google sends something like `123456.anonymous.google` as the placement. It appears that Google allows sites to be anonymous from those advertising on them some of the time. * Under what circumstances does Google anonymize the placement like this? * Why does Google anonymize some placements? * Does Google use a different anonymous identifier for each site that sends traffic? I've found that you can block ads that would show up as `anonymous.google` if you want to."} {"_id": "2512", "title": "How can I secure an installation of MediaWiki?", "text": "I want to run an installation of MediaWiki as an Internet-accessible personal wiki, running on wiki.mysite.com. However, I want to ensure that I am the only one who can read and write to this wiki. 
In the future, I may explicitly give other people read and/or read/write access, so the method of securing the wiki should account for that as well. I see two options: I can use some MediaWiki plugin or I can secure the subdomain with HTTP authentication. However, I'm not sure what the advantages and disadvantages of either are in the long run. Suggestions or advice as to what plugins or authentication methods might be most reliable?"} {"_id": "2513", "title": "What will happen if I transfer the registration of the primary domain on my 1and1.com account?", "text": "I have a hosting account with 1and1.com, and several domains registered and hosted with them. I need to transfer the registration of the **primary domain** on my account to a new registrar, but I don't really want to move all of my domains with them. Will transferring only the primary domain have any adverse effect on the other domains hosted on that account?"} {"_id": "49326", "title": "Emails have disappeared from 1 account on mail server (WHM/LAMP VPS)", "text": "I just received a call from a client saying that all their emails have disappeared from their inbox. I logged on to their webmail account via CPanel and RoundCube and this appears to be true - there are emails in sent, drafts, junk and deleted, but nothing in the inbox. There is plenty of room on our VPS and the client has plenty of quota. I sent a test email and it did appear fine in the webmail server. What could have caused this, and is there any way to retrieve it?"} {"_id": "46845", "title": "New site has PR9 as soon as it is made, why/how?", "text": "I was just messing around with Google Sites and noticed that as soon as I made my site the homepage had a PR of 9 (before it crosses your mind: all the links it makes are `nofollow`). The homepage of the test site I was playing around with is https://sites.google.com/site/samsnewsite030413/ Why does this happen? PR is on a per-page basis, and as the page is brand new, how does it have a PR? 
I've seen a similar thing happen with Github project pages; they seem to have a PR8 as soon as they are made; it's not like they are running off a standard URL with JavaScript bringing in the content - they seem to be static pages with their own URLs."} {"_id": "42813", "title": "buttons (text or images)", "text": "I would like to know if I should create CSS buttons or image buttons for my site. What is better? The problem is that the text is formatted with Helvetica Neue (I like that font very much) and I can't use it as a web font because of copyright issues. If I create the buttons in CSS they look beautiful in the browser, but get resized when the user has chosen a different text size for their browser. Image buttons have the advantage that I can embed my font, but they do not look as good as CSS buttons. Also, in that case I would have to make Retina versions of the buttons (just for the future, when any PC has a Retina display). Can someone please help me in this tricky situation? Does anyone know a web-safe alternative to Helvetica Neue? Should I disable the resizing of my button div-containers? What's today's standard when it comes to website buttons? TIA!"} {"_id": "49323", "title": "To allow or disallow alphabetical content directory pages?", "text": "We have a content site that publishes fresh content 2-3 times a day. The site has category pages for tags and for each letter of the alphabet. They both list the titles with links to all the content within that category or starting with that letter, respectively. Should you allow the robots to index these pages, specifically the ones that list the titles of pages starting with each letter?"} {"_id": "46842", "title": "How to fix this 404 soft error?", "text": "I have a static HTML website. `www.example.com/?12345` (this page doesn't exist) redirects to `www.example.com` and `www.example.com/page.html?12345` redirects to `www.example.com/page.html`. I don't know why this happens. 
Google said this is a soft 404 error and `www.example.com/page.html?12345` should return a 404 response, not a 200 OK response. How can I fix this? Here's my _.htaccess_ : RewriteEngine On RewriteCond %{HTTP_HOST} !^www\\. RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L] # Cache # 1 month Header set Cache-Control \"max-age=2592000, private\" # 3 days Header set Cache-Control \"max-age=259200, must-revalidate, proxy-revalidate\" # 10 min Header set Cache-Control \"max-age=3600, public, must-revalidate, proxy-revalidate\" # Include php AddHandler application/x-httpd-php5 htm "} {"_id": "49897", "title": "Is this a correct robots.txt file?", "text": "I would like to allow Googlebot and Mediapartners-Google (the AdSense user agent) to crawl my website. So I have written the below code inside my _robots.txt_ file. User-agent: Googlebot Disallow: User-agent: Mediapartners-Google Disallow: Sitemap: http://website.com/sitemap.xml Is the above _robots.txt_ file correctly written? Yes or no?"} {"_id": "52600", "title": "Should I have og:description and meta description together on every page?", "text": "Or will Google not like it if I have two kinds of description tags? I just put up a new blog and it automatically adds the `og:description` and `og:title`, and I am not sure if that is going to get me penalized or my pages somehow discounted if I have a meta description and `og:description` on the same page. Would anyone know how to best handle this situation? Should I try to get rid of the `og:description`? The site I am working on is a WordPress site."} {"_id": "46848", "title": "Backslash character %5c in URL makes relative links fail", "text": "Please can anyone help? Somehow Google has indexed the page: http://www.wiseowl.co.uk/courseware%5Cms-office.htm This looks fine, but the page should be: http://www.wiseowl.co.uk/courseware/ms-office.htm All works well until you click on relative links on the page. 
For the first (backslash) case, these fail, as the path takes you one folder too far up the directory tree. Our website is hosted by IIS, so no Apache-style commands are possible. Does anyone have any ideas? Thx in advance Andy"} {"_id": "68110", "title": "How can I view website requests in console in real time?", "text": "I want to display incoming requests to my web server in real time, in the console. Each request should be represented by a single line with information about the request, i.e. the client\u2019s IP address, etc. Is it possible? If so, how?"} {"_id": "58059", "title": "Correct usage of schema.org for logo?", "text": "Google gives this example to mark up your logo: http://googlewebmastercentral.blogspot.ca/2013/05/using-schemaorg-markup-for-organization.html But that example has the `img` outside the anchor; I want to put it inside... so I'm wondering, can I do it this way instead: Is this valid? _Note: I took out the URL`itemprop`_."} {"_id": "5058", "title": "Does pinging the page on social bookmarking sites where your site has been bookmarked help SEO?", "text": "If you bookmark your site on, say, delicious, does it help to then ping the url on delicious that contains the bookmark to your site?"} {"_id": "68117", "title": "Web Notification Design", "text": "I am designing a website which my users would interact with on the site. Take the FB poke function for example. There are two users, A and B, and they are both online. A pokes B. What I want to know is how to make B get an instant notification, knowing that A poked him just now? The way I thought of is to write JavaScript code that polls the database via Ajax for new notifications every minute. How could I reduce the number of times the data is queried? Or is there any better way?"} {"_id": "3536", "title": "Online Website Backup Solution?", "text": "I am managing a fairly big online teaching website (1 GB). It is a bit of a pain getting backups from my host, and my internet speed is also very slow. 
Because of this, I currently use my Hotfile premium account's remote upload facility to back up my site, but the problem is that these Hotfile backup files' lifetime is too short (30 days). I need an alternative. My requirements: 1. ability to upload a 1 GB file. 2. remote upload facility similar to hotfile.com 3. long lifetime for files (3-6 months). 4. good download speed similar to hotfile.com 5. can pay nearly $10 per month. Please help me solve this."} {"_id": "5055", "title": "Multiple Wordpress installs on subdomains affecting SEO?", "text": "I am working on a website for a photography business in town and was asked to do something I'd never considered before. The company wants a basic webpage for its \"general\" business -- myphotography.com, let's say -- and then a subdomain for each of their specialties: weddings.myphotography.com, portraits.myphotography.com, etc. They specifically requested that Wordpress be used and want a different theme on each subdomain. What makes most sense to me with this setup is using multiple Wordpress installations, one for each subdomain, and interlinking between them with the menus and trackbacks/pings. The last thing I want is to negatively affect SEO with this. It seems like it should boost each individual subdomain/site. But at the same time, the root domain is the same, so it could actually drag the entire thing down. What are your thoughts on this?"} {"_id": "5056", "title": "Is there a simple way to query the top URLs in Google Analytics?", "text": "I'm looking for an easy way to query GA programmatically in order to display the most popular pages on my site. GA has a Data Export API, but it seems kind of complex to work with for what I want to do. 
Is there a simpler way, or am I stuck with the Data Export API?"} {"_id": "34626", "title": "Internet data citation rules", "text": "I have a site, and on some pages I want to add data that is available on other sites; I will also put a proper reference on my data noting where I copied it from. I just need the proper rule for it. For example, http://www.prweb.com/releases/2012/4/prweb9358675.htm I want to add an excerpt from this page and refer to this page as reference reading. Can I copy and paste? Is this any violation of law?"} {"_id": "5051", "title": "How to use google adsense (or similar) for a web app with user specific content", "text": "I am working on a site that allows users to log in and enter their own information. Each user will have their own login, and enter their own content that can only be seen by them. It is conceptually similar to a to-do list site like Remember The Milk. I would like to use Google Adsense as a way of getting revenue from it, but from what I have read so far it has no way of handling this kind of site. What I would want from a service like Adsense is a way to submit a set of keywords that I can extract from the user's text, and get back a set of adverts to display to the user. There is Adsense's search facility, but that has to be submitted by the user themselves - I expect if the server submitted search requests automatically it would be a fast track to getting banned from Adsense. Is there any way to do what I want using Adsense, or are there any competing ad services that will do what I want?"} {"_id": "5052", "title": "amung.us and \"users online\" time window", "text": "Greetings. I would like to ask what the time window is over which amung.us calculates \"online\" users. 1 hour? 30 min? 15 min? Other?"} {"_id": "58054", "title": "Why would layout differ on first-load of a webpage?", "text": "Not entirely sure I understand what's going on, but would appreciate some pointers or things to try. (In Chrome/Mac...) 
In an incognito window, try loading http://ireland.media.info - and look at the search box on the right-hand side. It's falling off the bottom of the top menu bar, isn't it? Looks really ugly, doesn't it? Now, hit reload. Look at the search box on the right-hand side. Oh. It's displaying nicely in the middle of the top menu bar, isn't it? I've been writing HTML for twenty years, but have never seen this. Is it, perhaps, a bug related to the images on the left, which are browser re-sized (so they look pretty on a retina screen)? Is it a bug at all? Are you seeing this issue on any other browser or any other build of Chrome for another OS? (Just so you know: the HTML should be identical for each load, and, indeed, should be cached at the server using Varnish anyway.) Any clues?"} {"_id": "60465", "title": "Is there any SEO benefit in making website image paths relative to the current page", "text": "Is there any benefit in making the paths of content images match that of the page? E.g. if the path of my page is www.example.com/something/my-latest-article is it better to have content images with a URL such as www.example.com/something/my-latest-article/content-image-1.jpg or www.example.com/images/content-image-1.jpg"} {"_id": "38568", "title": "difference in using third party social buttons and directly integrating each social buttons ourselves", "text": "I wanted to add specific social buttons to my article. I used ShareThis. It gives a Facebook Like button, Google Plus button, etc. by default, whereas in other articles of different modules I had integrated the Facebook Like button myself by following the documentation (including markup in the head section). What is the difference between adding it manually with all the markup and using third-party code? 
Will that affect SEO or provide any other advantage on the respective social networking site (here, for example, facebook and google plus)?"} {"_id": "60461", "title": "Domain not accessing correct files in apache2 virtual hosts", "text": "I have 3 domains hosted on my Ubuntu server running apache2. For DNS I am using cloudflare.com. abc.com and xyz.com are accessing the correct files inside the abc.com/public_html and xyz.com/public_html folders, but the third domain is serving files from the root folder. The following is the content of my domain.com file, found under /etc/apache2/sites-available/domain.com: ServerAdmin webmaster@domain.com ServerName domain.com ServerAlias www.domain.com DocumentRoot /var/www/domain.com/public_html Options FollowSymLinks AllowOverride None Options Indexes FollowSymLinks MultiViews AllowOverride None Order allow,deny allow from all ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all "} {"_id": "28544", "title": "Blocked urls by Google", "text": "I have a website which is 2 years old now. I looked at it after a long time and was shocked to see that the Google robot has blocked 104 pages of my website. I saw this through webmaster tools. Is there any way I can unblock them? EDIT: I have attached the image but can't provide the name of the website. ![enter image description here](http://i.stack.imgur.com/hbnHO.png) Thanks in advance."} {"_id": "3079", "title": "Should I use a file extension or not?", "text": "I've always wondered about this and never found a good solution. But this question reminded me of it. 
When I have a URL on my website it can be displayed and accessed any of the following ways: http://www.somesite.com/subdirectory http://www.somesite.com/subdirectory/ http://www.somesite.com/subdirectory/index.htm http://www.somesite.com/subdirectory/index.html http://www.somesite.com/subdirectory/index.php http://www.somesite.com/subdirectory/index.asp http://www.somesite.com/subdirectory/some-relevant-keywords http://www.somesite.com/subdirectory/some-relevant-keywords.htm http://www.somesite.com/subdirectory/index.php?page=some-relevant-keywords http://www.somesite.com/subdirectory/?page=some-relevant-keywords http://www.somesite.com/subdirectory/?page=some-relevant-keywords&even=more-keywords etc... Now, I can understand the merits of adding keywords in the URL. Even the most basic SEO guide will mention to do just that. ... but for the sake of sanity, clarity, ease of reading, ease of use, and so on, **including web compliance** ... Is it **_preferred_** to have a file-extension or not? Really, deep down my logic tells me: yes, it should. The reason being is this stems back to the days of the past when the internet was mostly USENET, FIDONET, FTP and GOPHER. See, if a URL has no _filename_ , then it normally is considered a _directory_. This is where index.htm came about, because this by default lists the directory if no index file is found. However, soon enough, web programmers started overriding this and using index.htm to actually serve the content of that web directory _as a page_. The main difference, was markup language was added in, and this was parsed in the browser. With this markup language, the `Content-Type:text/html;` tag in the response header became the indicator to what filetype it was **for any file**. HTML seems to be the only \"filetype\" that just doesn't have consistently named extensions, except for when they are saved. 
Unfortunately, once web pages became the main thing, it became a security error to actually display the directory contents, so everything stayed hidden with only the actual URL content being displayed. Not to mention the cross-platform file-naming wars.. windows based require a 3 or less digit extension, and unix/mac can have more. So should it be `.HTM` or `.HTML` or `NONE` and let the platform decide? So in essence, I guess what I am trying to figure out is _beyond SEO_ and dealing more with aesthetics and web compliance."} {"_id": "7168", "title": ".html extension or no for SEO purposes", "text": "> **Possible Duplicate:** > Should I use a file extension or not? I know this question has been asked before on Stack Overflow, but what I have not been able to find in the posts I've read are concrete references as to WHY one is better than the other (something I can take to my boss). So I'm working on an MVC 3 application that is basically a rewrite of the existing production application (web forms) using MVC. The current site uses a URL rewriter to rewrite \"friendly\" urls with HTML extensions to their ASPX counterpart. i.e. http://www.site.com/products/18554-widget.html gets rewritten to http://www.site.com/products.aspx?id=18554 We're moving away from this with the MVC site, but the powers that be still want the HTML extension on the URLs. As a developer, that just feels wrong on an MVC site. I've written a quick and dirty HttpModule that will perform a 301 redirect from the .html URL to the same URL without the .html extension and it works fine, but I need to convince management that removing the .html extension is not going to hurt SEO. 
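The 301 approach described above can also be done declaratively with the IIS URL Rewrite module instead of a custom HttpModule. This is only a hedged sketch, not the asker's actual code; it assumes the URL Rewrite module is installed, and the fragment sits inside `<system.webServer>` in web.config:

```xml
<rewrite>
  <rules>
    <!-- Permanently redirect /products/18554-widget.html
         to the extensionless /products/18554-widget -->
    <rule name="Strip .html extension" stopProcessing="true">
      <match url="^(.*)\.html$" />
      <action type="Redirect" url="{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

A Permanent (301) redirect is the signal search engines need to transfer any ranking from the .html URLs to the extensionless ones, which is the point being argued to management here.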
I'd prefer to have this sort of friendly URL: http://www.site.com/products/18554-widget Can anyone provide information to back up my position, or am I actually trying to do something that WOULD hurt SEO, in which case can you provide references on that?"} {"_id": "42210", "title": "Does Google treat .html pages differently from .php pages?", "text": "> **Possible Duplicate:** > Should I use a file extension or not? My current hosting provider does not let me put PHP code inside of a .html page. So part of my content is in a database and displayed with PHP, but most of it is in pre-rendered .html files. I'd like to put more of it in .php files; however, I am concerned that search engines will penalize my .php pages because they are PHP. Is there any evidence to that effect? Thanks."} {"_id": "7797", "title": "SEO .html, .php or nothing", "text": "> **Possible Duplicate:** > Should I use a file extension or not? Hi guys, when I make a site, should I use the .php extension, rewrite it to .html, or just remove it (.htaccess)? What's your opinion about this? I've always learnt that it was .html, but I don't know why. I doubt that google cares, but why should .html be preferred over no extension?"} {"_id": "65298", "title": "Does .html extension in URL help for SEO?", "text": "I am working on `dotNET` and I have changed the `URL` extension from `.aspx` to `.html`. Is it good to use the .html extension? My actual URL is `www.example.com/Project.aspx?Id=2&Type=this-is-something` and I have replaced this with `www.example.com/Projects/2/this-is-something` Is that fine? Otherwise, please suggest which `URL` I should use for `SEO`. I can make the URL anything I want using URL Rewriting; I want to make it the best URL for SEO. Please help me regarding this."} {"_id": "38565", "title": "403 Error Crawling Pages", "text": "I am receiving more than 6000 errors in Google Webmaster Tools. 
It is showing \"access denied\"; can anyone please help me out with resolving this?"} {"_id": "38567", "title": "What's the relation between securepaynet and GoDaddy?", "text": "I created an account in Google Apps, and during the registration I also created a GoDaddy account (within Google Apps page). And, the username and password Google provided me will only login on this securepaynet settings page, not on GoDaddy's official page. So, what's the relation between securepaynet and GoDaddy? Why can't I login to the GoDaddy page (that have more configurations options)?"} {"_id": "23348", "title": "Ways to set up a simple inventory form quickly", "text": "I needed a simple inventory form of sort on my website where visitors could enter their information, pick the item they wanted and then submit the form and dispatch an email. The information will be updated in a database and then I can just verify the records in the database from another form when I confirm their payments manually. Since it isn't a full scale inventory or online shop system and it is going to be just a temporary thing, I feel like I want to do something like that in the shortest time. If create this thing in PHP I thought it may take quite some time to set up the forms, put in the form validations, and then programme the form logic with the inputs and database, etc. Are there other quicker and smarter solutions to create such a simple form quickly?"} {"_id": "53751", "title": "Web.config WordPress rewrite rules next to Magento", "text": "I've installed Magento on IIS in folder: `E:\\mydomain\\wwwroot` (I already have it all running correctly). I have no deeper folder `magento`, I placed all files directly in the `wwwroot` folder, so: wwwroot\\app wwwroot\\downloader wwwroot\\errors wwwroot\\includes etc... **UPDATE** : since I'm on IIS my `.htaccess` is ignored completely and my `web.config` rules are used instead. 
Here's my web.config in folder `e:\\mydomain\\wwwroot`: Next, I wanted to install WordPress. I unzipped all files in folder `e:\\mydomain\\wwwroot\\wordpress` and browsed to www.mydomain.com/wordpress/wp-admin/install.php, where I configured everything for my database. Everything was installed correctly. I then navigated to http://www.mydomain.com/wordpress/wp-login.php where I typed my credentials. I seem to be logged in and am redirected to http://www.mydomain.com/wordpress/wp-admin/ but there I receive an empty page. I enabled detailed error messages in IIS following this article: http://www.iis.net/learn/troubleshoot/diagnosing-http-errors/how-to-use-http-detailed-errors-in-iis I also checked with Fiddler and see that I receive a 500 error: GET /wordpress/wp-admin/ HTTP/1.1 Host: www.mydomain.com Connection: keep-alive Cache-Control: max-age=0 Accept: text/html,application/xhtml+xml,application/xml;q=0.9, _/_ ;q=0.8 User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.76 Safari/537.36 Referer: http://www.mydomain.com/wordpress/wp-login.php Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,nl;q=0.6 Cookie: wordpress_fabec4083cf12d8de89c98e8aef4b7e3=floran%7C1381236774%7C2d8edb4fc6618f290fadb49b035cad31; wordpress_test_cookie=WP+Cookie+check; wordpress_logged_in_fabec4083cf12d8de89c98e8aef4b7e3=floran%7C1381236774%7Cbf822163926b8b8df16d0f1fefb6e02e HTTP/1.1 500 Internal Server Error Content-Type: text/html Server: Microsoft-IIS/7.5 X-Powered-By: PHP/5.4.14 X-Powered-By: ASP.NET Date: Sun, 06 Oct 2013 12:56:03 GMT Content-Length: 0 My WordPress web.config in folder `e:\\mydomain\\wwwroot\\wordpress` contains: I also want my WordPress articles to be available on `www.mydomain.com/blog` instead of `www.mydomain.com/wordpress`. Of course my admin links for Magento and Wordpress should also work. 
How can I configure my web.config files to achieve the above?"} {"_id": "52057", "title": "Line spacing does not go to zero", "text": "I have reduced the font size to very small here but large interline gaps remain. Additionally, it does not seem to be monospace, since line length does not increase monotonically as it should with font-family: \"Courier New\",Courier,monospace. In edit mode, everything is fine."} {"_id": "29607", "title": ".htaccess Redirect Old Threads", "text": "I have moved from mybb to vanilla and want to redirect old threads to the new format, so the indexed threads on google aren't lost. The old format is > showthread.php?tid=3003 Whereas the new format is > index.php?p=/discussion/3003 How would I redirect this via .htaccess?"} {"_id": "52813", "title": "Tackling thin content on an images gallery", "text": "We run an images gallery as part of our site; however, we have over 8,000 images and every image has a separate HTML page of its own to display the image caption, related images and comments from users of the site. This seems to be a problem, especially with the Google Panda update, because these pages are technically \"thin content\". What would be the best way to tackle this? We'd love some feedback and advice regarding this scenario. We have a few options we thought of already but can't decide: 1. We could `noindex` the separate image pages and lose any image search listings we have for the image, in favour of removing these thin pages from the index. 2. We could 301 all of the individual image pages back to the image category listing and anchor each image (e.g. `#img2122`), and include all of the comments and description on the category listing page itself. If we were to simply list all of the images and content on the category pages themselves, what's the best method? We could add all of the content in the anchor tags and use jQuery to display them in a box when a user clicks on the image, or we could use Ajax to retrieve the information. 
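For the MyBB-to-Vanilla question (29607) above, a hedged mod_rewrite sketch for .htaccess could look like this; it assumes the old `tid` values map 1:1 to the new discussion ids and that mod_rewrite is enabled:

```apache
RewriteEngine On
# Capture the old MyBB thread id from the query string...
RewriteCond %{QUERY_STRING} ^tid=([0-9]+)$
# ...and 301 showthread.php?tid=NNN to index.php?p=/discussion/NNN.
# The query string in the substitution replaces the old one.
RewriteRule ^showthread\.php$ /index.php?p=/discussion/%1 [R=301,L]
```

Using R=301 (rather than the default 302) tells Google the move is permanent, so the indexed threads are carried over to the new URLs.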
However, what's the best Ajax method for SEO? Any ideas, suggestions, tips or advice are greatly appreciated; thank you in advance for anything offered."} {"_id": "53757", "title": "Google+ and authorship of brand pages", "text": "Does Google+ support authorship for brand pages (i.e. not a personal Google+ profile)? It is unclear from the context. E.g. here it says you need a recognizable profile photo of a face: https://plus.google.com/authorship But in the examples they have the ZDNet logo: https://support.google.com/webmasters/answer/1408986?expand=option2 What would be the authorship markup for Google+ pages, as page urls differ from profile urls?"} {"_id": "10766", "title": "SEO for different content displayed via PHP", "text": "Do search engines index, crawl, and display content that is shown only to certain users via php? For example, I have a page which looks like this: some content1 _php if user is x user_ some content2 for user x _php if user is y user_ different content3. So, would the search engines consider the keywords/links/seo of all three types of content or just that displayed to all users? Thanks"} {"_id": "28879", "title": "Force google to index only rewritten urls", "text": "Let's say we have a website with a URL like `http://www.domain.com/webdesign/` that rewrites to `http://www.domain.com/?article=1`. Now when I search for my keyword in Google, it shows `http://www.domain.com/?article=1` in the search results, while there is no link to that on any page and all menus are linked to `http://www.domain.com/webdesign/`, which is the one I want indexed. **Is there any way to solve this?**"} {"_id": "64903", "title": "Requirements for Google HTML snapshots", "text": "Are there any requirements for a HTML snapshot for Google other than: > .. must contain the same content as the dynamically created page .. 
found here: https://developers.google.com/webmasters/ajax-crawling/docs/html-snapshot?hl=nl There is also a question found here: Content of html snapshot for Search engines But it does not really answer my question. So what about the structure and layout? Or can I just make a stripped-down version in plain HTML?"} {"_id": "18893", "title": "What is the Google version of Yahoo's Data tables?", "text": "I find Yahoo! Data tables very useful. You can access them using YQL (Yahoo query language) to retrieve information such as stock markets, currency rates, etc. See http://developer.yahoo.com/yql/console/ for more information. I was wondering, though, whether Google has anything similar?"} {"_id": "14641", "title": "What is best for SEO - 'RDFa Semantic Web' or 'Link Building'", "text": "Given that RDFa is fairly sensible and Google uses it, should I plug some sensible semantic web tags into my site or should I invest my energies into link building? I was also planning to use this to put my semantic markup together. I don't think the site is important to the question; however, it is a relaunched hotel website."} {"_id": "18897", "title": "Beginners resources for SEO", "text": "I'm totally new to the business of SEO - what are good resources to get started? When I try to Google such a question I end up quickly at black-hat and paid sites; I'm looking for the white-hat variety."} {"_id": "14647", "title": "How to incorporate WordPress blog into a static website?", "text": "I'm a designer and front-end web developer and I have a client that wants me to put her wordpress into a static site. Is there a way that I can embed the blog into a new site? Or is there a way that I can create a back door so that even though she cannot read html she can add new updates to the site? I've been looking for the answer, but can't seem to figure it out. 
Please let me know what my options are."} {"_id": "45566", "title": "Alexa ranking of a website?", "text": "If a user visits demo.domain.com, does that count as having visited the site domain.com? Does the Alexa ranking apply to the sub-domain or the primary domain?"} {"_id": "58467", "title": "Is an anchor link registered as duplicate content by Google?", "text": "I have a store where fairly important content is not viewed by all visitors (only 66% according to mouseflow), as users have to scroll a bit to see it, which not all do. I don't wish to move the content up at this time, but wanted to add an anchor to all source links that would skip the header of the eshop and go directly to the content. This would provide 100% views of what I want. But will Google register anchor links as duplicate content of non-anchor links? Would I lose PageRank from this? `/food.html` AND `/food.html#bread` - duplicate content or not?"} {"_id": "21920", "title": "Can I modify an existing certificate to run on multiple hostnames?", "text": "I've been investigating and I found instructions on how to create an SSL certificate with multiple hostnames, but I need to know whether I can modify an existing certificate (not create a new certificate) to allow multiple hostnames."} {"_id": "58948", "title": "URL doesn't change on sub-pages or when leaving site", "text": "I'm new to the world of webmasters, so I haven't been able to appropriately search for an answer. I don't know the keywords that describe my situation. I registered hallienoelle.com and the `.net` version through GoDaddy and set up forwarding with masking to hallienoelle2.wpengine.com. When you go to the `.com` or `.net`, it takes you to the homepage with no slashes or anything after it. If you click on a different page, it takes you to that page, but doesn't change the url. It still says `hallienoelle.com` with no slashes or anything after it. 
The _most worrisome_ part is that if someone clicks an external link **my URL is still in the address bar.** For example, clicking the _Powered by Wordpress_ link at the bottom takes you to `wordpress.org`, but the address bar still shows `hallienoelle.com` with no slashes or anything after it . Did I do something wrong in setting up GoDaddy forwarding or is it a Wordpress/WPEngine configuration issue? **UPDATE:** Thanks to PatomaS I know how to fix the external link problem. I still am unsure of how to get the directory/folder/webpage to display after the `.com` like so: 1. Go to `hallienoelle.com` (and see the content from `hallienoelle2.wpengine.com`) 2. Click a link, like `About Us` 3. This should take you to `hallienoelle.com/?page_id=11` (and you see the content from `hallienoelle2.wpengine.com/?page_id=11`) 4. **But,** currently it still shows just `hallienoelle.com` even though you are looking at the `?page_id=11` page. Do I have to go through all my internal links and hard-code something like `About Us`? This seems inefficient and error prone. I'm probably just misunderstanding something. **UPDATE 2:** WPEngine also gives me the option of forwarding to `http://198.58.98.52/` and I (think) it reads the requested URL from the header and serves my content. I add this because as I've been reading someone said redirects to `mysite.example.com` are generally used for free hosting providers and are looked down upon. I use `hallienoelle2.wpengine.com` because it is recommended on `http://wpengine.com/support/how-to-configure-your-dns/` under the heading `Setting up a subdomain`."} {"_id": "14868", "title": "How can I browse the Web blind for a day?", "text": "\"Close your eyes\" isn't the answer I'm looking for, but +1 to you if it was your first thought. I'm putting together an awareness campaign called Browse Blind. The goals are to: * Help sighted people experience the Web as a visually impaired user for one day or more. 
* Show video footage of visually impaired users browsing popular sites such as Facebook. * Encourage webmasters to take a more active interest in making their sites accessible. * Demonstrate the features that make a site accessible, and those that impede accessibility. * Make the Web a better place. The project came about after a health scare that resulted in three trips to a local eye hospital. My sight is OK now, but it struck me that, as much as web designers and developers wax lyrical about accessibility, very few seem to have used a screen reader for any length of time (myself included) or considered what their online experience would be like if they suddenly lost their sight. And _not a single one_ of the 100 or so webmasters I've spoken to about accessibility have watched partially sighted users browse the Web. (It's one of the topics I always bring up at Web conferences.) My questions are: 1. **What is the most common method of Web access** (and the name of any related software) for partially sighted users? 2. **How can I simulate this method** so that a non-technical sighted user on any desktop platform can browse the Web in the same way as a partially sighted person with minimal setup steps? For question 2, I'd like to avoid recommending that people download and install a screen reader for their OS if I can, because I think doing so will drastically reduce the number of participants. So I'm looking for Web based simulators (and browser plugins if they exist). If you know of a cross-platform screen reader that's easy to set up, then that might be an option too, as would step-by-step guides that turn a Mac or Windows OS and browser into 'accessible mode' without further software, _as long as you're sure that this is a method that partially sighted computer users employ_. The closest in-browser screen reader emulator I've found so far is WebAnywhere. (Warning: audio plays as soon as you click that link.) 
It's technically very impressive, but it's a beta, and it doesn't yet appear to work reliably as a functional web browser. If there's nothing out there like this, that's fine as an answer too. (I'm prepared to build something that emulates the most common access methods, but I need to understand what those are first.)"} {"_id": "58238", "title": "Hiding Menu Item with CSS - will Google penalise my site?", "text": "I want to hide some sub-menu items using a CSS class assigned to the specific menu item. For example, the class is nodisplay: /* Hide menu items */ .nodisplay { display:none !important; } This works great, but I wonder if this would be penalized by Google, as I read that Google doesn't like hidden text. I read that Google marks this as spam; is that true or not? I had a look at the Gavick Menus and Helix Menus; they too use `display: none;` or `left:-999em;` for hiding elements. So my dilemma is: if I use `display: none;` for hiding all my sub-level-3 menu items, will Google penalize my site or not? I am asking this because I saw a lot of drop-down menus use this technique."} {"_id": "38623", "title": "How do I know my email open rate?", "text": "I saw some people saying their newsletter has an X open rate. How can I find this out?"} {"_id": "21298", "title": "Most CSS Grid frameworks use pixel as css units, why?", "text": "I started to develop my own web design using a grid framework (960 CSS Framework) and noticed that most other famous css grid frameworks use 940/960px as the maximum page width. Some of them have online generators where you can calculate and generate the same framework but for a different width. Can you tell me why they suggest 960px as the default? And more important: why is everything measured in pixels rather than pt, cm, % or any other css units? **Edit:** Isn't it better to use 'in' as the css unit and be sure that on every screen (computer, smartphone) it will have the same size? P.S. 
Some other grid css frameworks: * Blueprint * YUI 2: Grids CSS * Bootstrap * Skeleton"} {"_id": "33337", "title": "Tracking different types of users", "text": "I have an organizational web site that is used by both staff and clients. We've been looking at the analytics data to see how clients are using the site, but finding that some of the usage patterns are almost certainly driven by staff using the site. We would therefore like to somehow track the type of users. It would be fairly simple to have the server side code look at the current user's IP address to determine if it belongs to our staff network and inject this into the page, but I haven't seen any way to push this information into Google Analytics. To be clear, I do **not** want to eliminate the internal traffic from tracking. I'm still interested in it. I just want to be able to separate the two. Any ideas?"} {"_id": "38629", "title": "Webmaster Tools is throwing out 404 errors on link not on page", "text": "Webmaster Tools is showing thousands of 404 errors, where pages on the site are referring to another incorrect URL. For example, URL not found `www.example.com/shop/=`, linked from `www.example.com/shop/gift-voucher` and `www.example.com/shop/special-plant-offers`. I obviously have checked the source and cannot find any references to this link on any page. The only consistent issue is that it only seems to report this error on pages with two section i.e. `www.example.com/shop` does not report any error whilst all pages with `www.example.com/shop/xxx` (where `xxx` can be several different pages such as `gift-voucher`) all report this. I cannot seem to duplicate this error. I have run a link checker (we use Screaming Frog) and it does not report this error. I have fetched these pages as a bot, and these do not report this error. I am at a total loss. I cannot even duplicate the issue, but it is most definitely an issue, as Webmaster Tools is reporting new errors every day. 
Is this perhaps Googlebot doing its own thing?"} {"_id": "45112", "title": "Proper name scrolling news/photo", "text": "I'm very new to web design and want to install a scrolling news/picture banner on the site, but don't even know the proper term to search for to find out how to do it. Here's an example of what I'm talking about: http://www.ed.gov/ and http://www.nysed.gov/ What is this called?"} {"_id": "37980", "title": "Do own brand organic searches improve your overall SEO?", "text": "I noticed something interesting at https://www.wisepricer.com/partner.php Every outgoing link is actually a Google Search, rather than a link to the partner's domain. So it got me wondering, can this behavior make the Google algorithm give greater SEO juice than a direct link? And in general, if your domain is searched a greater number of times, does your overall SEO improve?"} {"_id": "13072", "title": "Should I set up an Ebay account besides my website ", "text": "I'm trying to increase the conversion rate for my website. Should I open an Ebay account and try to redirect customers from Ebay to my website?"} {"_id": "45445", "title": "Import email addresses to mail group using Plesk", "text": "My web hosting service uses Parallels Plesk Panel 9.5 as their Control Panel. I just created a mail group (like a distribution list: one email address to email multiple people). I can only add ONE email address at a time through the Control Panel, and I have hundreds of emails to upload. Is there a better way to do this?"} {"_id": "54049", "title": "Do article published dates affect SEO?", "text": "In my blog I publish articles with the date shown under them. My questions are: 1. Will using dates have a positive or a negative impact? 2. 
Finally, should I use published dates or not?"} {"_id": "30439", "title": "What influences the loading of a web page?", "text": "What influences the loading of a web page? What makes a web page load faster or slower?"} {"_id": "13077", "title": "what is user agent Whirlpool BD2; H010818", "text": "What is this user agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0; Whirlpool BD2; H010818)? It came from an IP address tracing to the organisation Websense. The question is also here. Thanks"} {"_id": "13078", "title": "Is it legal to show RSS feeds on my website from another website", "text": "I want to display the feeds from multiple news websites on my website and create a site which would be something like http://paper.li So is it legal to display those feeds directly on my website? The site is not a feed reader, but I want to pick specific news articles from the feeds and display them."} {"_id": "30432", "title": "Migration from one domain to another - Transferring the social media stats", "text": "I am planning to move my site from one domain to another, i.e. from domain a.com to b.com. The site also has a lot of content, but the migration of content is not an issue, and the 301 redirect will take care of all the backlinks. My real worry is transferring the social media share links and stats from domain a.com to b.com. I need some insight into any way in which these can be migrated seamlessly from domain a.com to b.com"} {"_id": "30430", "title": "Re-indexing website with clean URL's", "text": "So I have a website with URL's like this: http://www.domain.com/profile.php?id=151 I've now cleaned them up with mod_rewrite into this: http://www.domain.com/profile/firstname-lastname/151 I've fetched and re-indexed my website after the change. What is the best way to make the old dirty ones disappear from search results and keep the clean ones? 
Is blocking profile.php with robots.txt enough?"} {"_id": "44221", "title": "How to access phpMyAdmin with Hosting Controller?", "text": "I am trying to update a site for a client. All the login info they have given me is for something called Hosting Controller. I've never seen it before, but it seems like cPanel. I can create email accounts and databases etc. The only thing I cannot seem to do is access phpMyAdmin (or similar). Would anyone know how to do this?"} {"_id": "44220", "title": "How to avoid spammy links affecting us before SERP rankings go down", "text": "Through Webmaster Tools we have recently discovered a host of spammy links pointing to our website, all put up in the last month. We have not engaged any SEO agency or link building, and we never created these links. Worse, these are spammy forum sites, and it looks like a deliberate attack on our site, as the link and keyword are the same on all these sites. We have not received any warning yet, but now that it has come to our notice we want to act before any SERP rankings go down. One thing we have observed is that all these spammy links point to a single inner page of the website. Please advise: a) Will the Google disavow tool work if we disavow these links? b) Is it a good idea to make this specific page 'nofollow' and even 'noindex'? The reasoning behind this is that if the page is not followed and not indexed, the negative links pointing to it are effectively nofollowed and may drop out of the Google SERPs. Or what's the best alternative way to tackle this? 
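If the disavow route mentioned in 44220 above is taken, Google's disavow tool accepts a plain-text file; the format below is real, but the domains are placeholders for the actual spammy forums:

```
# Spammy forum links found via Webmaster Tools
# Disavow an entire domain:
domain:spam-forum-one.example
domain:spam-forum-two.example
# Or disavow individual URLs:
http://spam-forum-three.example/thread.php?id=123
```

Noindexing the targeted inner page would also discard its legitimate rankings, so disavowing the links is usually the less destructive of the two options the asker lists.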
Please advise urgently; any help is appreciated."} {"_id": "44223", "title": "Can Google show some pages deleted six months back and blocked using Robots.txt?", "text": "If I deleted some pages on my website six months back and blocked these pages using Robots.txt, can those non-existent pages show up on Google?"} {"_id": "44226", "title": "Security: Hide mysql connection details file", "text": "Ok, so I know this must be a very basic question, but my problem is the following: I have developed my first PHP mini-app (so I'm fairly new to this) and I am pretty sure that there are two files of my web application that should be somehow protected from prying eyes: * the one with the details to connect to the db * the one with the password hashing and the salt key How should I go about protecting them? I have read somewhere that they should be put somewhere above my public directory, but then how will the app be able to access them? Should I change something in my apache config so that the Document Root points somewhere else? Any other things I should consider in terms of security? So far I am escaping every user field that goes into the db with mysql_real_escape_string(), but apart from that and hashing and salting the passwords, I'm not doing anything else."} {"_id": "36175", "title": "http request terminating early", "text": "I noticed that on some of my sites, images were occasionally not getting downloaded fully. After a bit of investigation it appears that it is not restricted to images - .css, .js etc were also occasionally terminating early. The faults appear to be random. When I use the debug proxy tool Fiddler2, it reports that fewer bytes have been received than were requested. Firebug reports \"Image corrupt or truncated\". Obviously this is mainly a concern between me and my hosting company. However, despite many emails they have not been able to get to the bottom of it. 
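For the credentials-file question (44226) above: if moving the file above the document root is not possible, the web server can simply refuse to serve it. This is a hedged sketch using the Apache 2.2-style directives already seen elsewhere in this dump; the filename is illustrative:

```apache
# In .htaccess or the vhost: never serve the DB credentials file over HTTP.
# PHP can still read it via require/include, since that is a
# filesystem access, not an HTTP request.
<Files "config.php">
    Order allow,deny
    Deny from all
</Files>
```

The cleaner option remains something like `require '/path/above/docroot/config.php';`, which works because PHP's include mechanism is independent of Apache's DocumentRoot.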
Transferring to another hosting company is obviously an option but is really a last resort. Has anyone seen this kind of thing before, or can anyone suggest what might be causing it? Or is there any Apache setting or something that I can ask them to check out? Will Apache log this kind of error? They haven't been able to provide me with any logs, but if I know exactly where things are logged, maybe I can prompt them into action. **EDIT: More data as requested by commenters..** I created a test page. It is pure HTML and embeds 9 images, each between 4 and 6 MB in size. This is a very large amount of data. I found that having so much data sped up the occurrence of faults, which makes benchmarking easier. As a control, I ran the same test on the server of another hosting company that I have access to - I never get faults when running on this other host. It is on shared hosting. I have tested it on FF, IE & Chrome. **EDIT - DETAILS OF HEADERS ETC ...** (In case you're wondering, this version of the test is not pure HTML but has a small amount of PHP added to prevent caching.) LiveHeaders o/p... https://docs.google.com/file/d/0B93lksnTC7_cYVhQT2lMbms5Mm8/edit What you see in Firefox... https://docs.google.com/file/d/0B93lksnTC7_ceHBSbmhXWHhXdlk/edit What Firebug logs... https://docs.google.com/file/d/0B93lksnTC7_cOEdmeU1lMkh4UE0/edit If you have difficulty getting into google docs - alternative... https://skydrive.live.com/redir?resid=9F8DE44BC9AC7FD6!107&authkey=!AAXurFZleJE5okQ As you can see, the images that failed were: DSC_0046_0.JPG DSC_0232_0.JPG **EDIT - UPDATE** I've noticed that it gets worse if I simultaneously run several copies of the test. (And running several copies of the test on the control host still passes.)
This further suggests that if the server gets too many requests, it might be terminating some of them early or something."} {"_id": "36177", "title": "500 error on Joomla website", "text": "PHP Fatal error: Call to a member function setQuery() on a non-object in /home/josh/public_html/administrator/components/com_jfusion/plugins/phpbb3/forum.php on line 226 Just moved over to a new server. Anyone have ideas as to what is wrong? Is this a database issue? line 226: //get permissions for all forums in case more than one module/plugin is present with different settings $db = & JFusionFactory::getDatabase($this->getJname()); $query = \"SELECT forum_id FROM #__forums WHERE forum_type = 1 ORDER BY left_id\"; $db->setQuery($query); //226 $forumids = $db->loadResultArray();"} {"_id": "33528", "title": "How should I fix the problem of duplicate content created by my CMS system's manual page alias feature?", "text": "My site's CMS has a manual page alias feature which is in wide use. I recently noticed that the aliased pages and the original pages are showing up in GA with separate traffic. For example: www.example.com/subfolder/pagename.aspx www.example.com/pagename/ These two pages serve the exact same content from the CMS, so it is in effect duplicate content, which is bad for SEO. From this extremely helpful page it seems the best option is to add a canonical link to the header. Can anyone verify that as the preferred solution? I would also like to know if this is a common problem among CMS platforms. The one we are using is called Ektron."} {"_id": "54198", "title": "Footer not showing in website depending on which item is loaded", "text": "I designed a website which is having an issue; I checked the HTML tagging very carefully but cannot fix it. If you go to this item: http://www.tahara.es/store/headbands/11/Ivory-Turquoise-headband You will see the FOOTER display normally.
However, if you go to this other item: http://www.tahara.es/store/headscarves/15/Grey-and-ivory-with-stoned-flower-Headscarf The footer does not show. Any clue as to what I am missing or adding? The footer DIV is like this: `
`"} {"_id": "18093", "title": "seo htaccess 301 redirect", "text": "I have an SEO question. I have been checking the progress of my site's indexing in Google, and for some bizarre reason best known to itself, Google has indexed a very random URL against my domain which doesn't exist: http://www.footballadvisor.net/index.php?option=com_content&view=article&id=35:is-it-possible-to-change-the-types-of-menu-entries&catid=31:general&Itemid=46 Is there a way I can forward the URL and query string to www.footballadvisor.net as a 301 in my htaccess file? Thanks :)"} {"_id": "59355", "title": "Google / SEO guideline about using a whole page overlay DIV and putting it in focus on Mouse Leave", "text": "Recently I stumbled onto a startup news website, yourstory.com, which seems to be using a method to increase user engagement (increase page views) by capturing the user's \"mouse leaving\" event and, on recognizing that event, setting focus to a NEW full-page content DIV -- the purpose seems to be to draw the user's attention back to a \"Most Read\" kind of content page and invite them to click through to another page. Even though it seems like a fine approach to me (from the user experience side), what I am not sure about is whether such a page design could be penalized by Google as misleading (it has hidden DIVs, etc.). I am hoping to get an opinion on whether the implementation is SEO-safe: it seems jQuery is used to capture the mouseleave event, which when triggered sets focus to an overlay DIV that has \"Most Read\" kind of content elements. Please advise."} {"_id": "18090", "title": "What platform can I use to establish a users support forum?", "text": "I was asked to develop a user support community forum for a small, young hi-tech company. This kind of forum is very popular with companies these days and is a great alternative to official product support lines. An example of what we are looking for is the Analog Devices' Engineer Zone forums.
I am a total noob in this area, so any advice will be appreciated - how to start developing such a site, what platforms are available (preferably open-source/free), and what the advantages and disadvantages of these platforms are. Is a _stackexchange_ style forum appropriate for such a site? * If you think there is a better SE site to post this question on, please let me know."} {"_id": "33520", "title": "SEO: in what element should text be placed?", "text": "I am making my website better for SEO and I came up with a question: in what element should text be placed?

I know that I should use each of them in different situations, but I don't know in which situations... **EDIT:** Example of 2 situations/doubts I encountered"} {"_id": "61762", "title": "Does Google crawl and index sites hosted on an IP address only (with no domain name) and non-standard port?", "text": "I have a website on `203.162.177.159:8071` (Actually, that isn't the real IP address; I changed it for this question.) Because of an internal rule, this site must not have a domain name, and it runs on a **non-standard port**. In Google Webmaster Tools, I see * no crawl errors found * no verification errors found * I tried \"Fetch as Google\" and they are all successful. I can then click the URL to get to the page myself. * I don't have the sitemap yet So why are the crawled page and indexed page counts still **zero**? Does Google not accept non-domain-name URLs?"} {"_id": "18767", "title": "Why don't people embed style and scripts in the HTML?", "text": "I know this sounds like a silly question, but hear me out. These days, there are a lot of 'media aggregator' tools that will take your globs of javascript (laden with coffee and underscores and backbones and who knows what else) and string them together and rewrite them and munge them and uglify and minify them and tape them together so that clients _only have to make one request_ to get all the javascript or style sheets for a site. This is pretty cool. I like keeping my java(coffee)script files small and separate, and I like my clients making only one request. But if we go through all that trouble to make our javascript really small and stuff, why not just send it WITH the first request? Why not just have tools that do the minification and aggregation and then write it directly to the HTML? Common sense tells me that this would be bad, but I'm not sure why.
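The kind of tool I have in mind seems almost trivial to write; a toy sketch of such a build step (the file names and markup below are made up) might be:

```python
# Toy "inlining" build step: replace external stylesheet/script references
# with the (already minified) asset contents, so the first HTML response
# carries everything and the client makes no further requests.
def inline_assets(html: str, css: str, js: str) -> str:
    html = html.replace('<link rel="stylesheet" href="site.min.css">',
                        '<style>' + css + '</style>')
    html = html.replace('<script src="site.min.js"></script>',
                        '<script>' + js + '</script>')
    return html

page = ('<head><link rel="stylesheet" href="site.min.css"></head>'
        '<body><script src="site.min.js"></script></body>')
print(inline_assets(page, 'body{margin:0}', 'console.log("hi")'))
```

Run at deploy time, something like this would make the minified assets ride along in the very first response.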
Can anyone explain?"} {"_id": "18766", "title": "mod_rewrite prevent repeating backreference", "text": "I am trying to match a single reference in a URL to pass to a script, while retaining the rest as the path to the script. For example: #/scripts/parameter/foo/bar.php should call #/scripts/foo/bar.php?id=parameter RewriteRule ^scripts/([^/]+)/(.*)$ scripts/$2?id=$1 [L,QSA] However, my backreference is matching multiple times. $2 only contains \"bar.php\" and $1 is somehow repeating for \"parameter\" and \"foo\". How can I prevent the backreference from repeating itself? **Edit:** I've narrowed down the problem somewhat by fiddling with the second part of the rewrite rule. #This doesn't work, as above #Outputs: scripts/bar.php?id=foo?id=parameter RewriteRule ^scripts/([^/]+)/(.*)$ scripts/$2?id=$1 [L,QSA] #This works #Outputs: foo/bar.php?id=parameter RewriteRule ^scripts/([^/]+)/(.*)$ $2?project_slug=$1 [L,QSA] Which confuses me even more! Any help appreciated"} {"_id": "20271", "title": "alternatives to google adsense for ajax reloads", "text": "What alternatives to Google AdSense are available that support the reloading of ads through an AJAX application? I can't locate one."} {"_id": "20273", "title": "Transfer a national domain to foreign registrar?", "text": "Can you transfer a national domain (like .pl) to a foreign registrar, one that does not actually offer buying that type of domain?"} {"_id": "20274", "title": "Redirect to subdomain port 8080", "text": "I have my domain registered and running. I have installed Liferay, which is running on Tomcat 7; my site URL is iloveliferay.com:8080 and I want this to be at `themes.iloveliferay.com`. Which file do I need to change to make this possible?
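My guess is that the answer involves putting an Apache virtual host in front of Tomcat, something like the sketch below (untested, and the proxy target is just an assumption on my part):

```apache
<VirtualHost *:80>
    ServerName themes.iloveliferay.com
    # Forward everything to the Liferay/Tomcat instance on port 8080
    ProxyPreserveHost On
    ProxyPass        / http://localhost:8080/
    ProxyPassReverse / http://localhost:8080/
</VirtualHost>
```

But I don't know whether that is the right file to touch.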
I have VM cloud hosting and I have root access."} {"_id": "27555", "title": "Is Google Analytics safe for websites that deal with sensitive information?", "text": "I work for a company that writes several webapps that deal with a lot of sensitive information, such as full name, date of birth, address, and SSNs. Currently we don't have anything to measure site usage, but I would like to use Google Analytics to track usage and statistics about our users. What data is sent to Google when you use Analytics? If I put this on a page that contains any of the above information, will that data be sent to Google? Or are they just getting the necessary information like user agent and IP address?"} {"_id": "58969", "title": "Subdomains as part of one site", "text": "Let's say that the site consists of some pages like: some_product.example.com some_service.example.com another_product.example.com ...and so on. Can I work with such a site using Google Webmaster Tools and Sitemaps, as with normal sites made within one domain?"} {"_id": "27558", "title": "Do short posts hurt SEO?", "text": "I'm interested in making a custom post type in which every post is a single image. My question is, WordPress creates a page for each post, and since it won't have any words, will it hurt the page rank of the website? If this question doesn't belong here, I'd love to know where I can ask it."} {"_id": "64800", "title": "Rich snippet for Google Custom Search - Schema.org", "text": "I am trying to extract the book URL from a link using microdata. The format is specified in schema.org. Here is my HTML.

{{ book.name }} - {{ book.author }}

{{ book.about }}

When I use the Google snippet testing tool, the JSON API returns the book as an HTML link. However, when I make the call in JavaScript, the value of url is the text (\"Read\"). What am I missing?"} {"_id": "27809", "title": "Does Google count backlinks from the homepage to inside pages?", "text": "I have a site with good PR, and my inside pages are getting an increase in PR, but they don't have links pointing to them except from my homepage. Does that mean that Google counts ALL links on my homepage, including links to inside pages? Does it calculate the inside pages' PR with the one coming from my own domain, my homepage, too? Also, if inside pages that got high PR from the homepage link back to the homepage, will that increase the homepage PR further, since those links should count too? > By the Google PR algorithm formula, by the calculations on Wikipedia and the Stanford PR algorithm explanation ( which is originally developed by ) it counts those links, and it also counts the post-increase backlink again, circling a few times ( it stops because of the d ( 0.85 ) damping factor ), but it counts them. Does anyone know if this is correct?"} {"_id": "67884", "title": "Do noindex,follow pages pass pagerank / link juice?", "text": "Assuming one marks a page as 'noindex,follow', is PageRank / link juice passed to other pages nevertheless?"} {"_id": "10180", "title": "Implications of automatically \"open\" third party domain aliasing to one of my subdomains", "text": "I have a domain, let's call it `www.mydomain.com`, where I have a portal with an active community of users. In this portal users cooperate in a wiki way to build some \"kind of software\". These software applications can then be run by accessing `\"public.mydomain.com/softwarename\"`. I then want to let my users run these applications from their own subdomains. I know I can do that by automatically modifying the .htaccess file. This is not a problem. I want to let these users create DNS aliases to let them access one specific subdomain.
So if a user `\"pippo\"` who owns `\"www.pippo.com\"` wants to run the software HelloWorld from his own subdomain, he has to: 1. Register on my site 2. Create his own subdomain on his own site, run.pippo.com 3. From his DNS control panel, create a CNAME record `\"run.pippo.com\"` pointing to `\"public.mydomain.com\"` 4. Type `http://run.pippo.com/HelloWorld` in a browser When the software (which physically runs on my server) is called, it first checks that the originating domain is a trusted one. I don't do any other kind of check that restricts software execution. From an SEO perspective, I care about Google indexing of `www.mydomain.com` but I don't care about indexing of `public.mydomain.com`. What are the possible security implications of doing this for my site? Do you know of an existing website implementing a domain configuration like this one?"} {"_id": "28596", "title": "SEO effects of linking to subdomain", "text": "I have http://hollywoodnose.com as a main site. Then I have http://forum.hollywoodnose.com as the subdomain. I'm cautious about linking to the subdomain from the main site. I did this before and dropped 50% of the traffic from the main site. Also, Google Webmaster Tools showed over 3000 incoming links from Hollywood Nose to the subdomain / forum. I'm assuming Google frowned on this and saw it as spam or something of that nature. All I want is a link to the subdomain in the top nav bar of the main site. Is this safe? Will I get an overabundance of incoming links resulting in an SEO drop? Does anyone have any experience in this matter? Thank you."} {"_id": "3544", "title": "Make your site anti-bot?", "text": "I remember a site that closed due to misuse, and I wonder if bots played a part in it. If a bot is POSTing something to my site, what are ways I can combat it? I was thinking of setting some cookies and having the cookies changed via JavaScript + a timestamp and signature (so yesterday's cookies can't be used today or next week).
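Roughly what I have in mind, sketched in Python rather than PHP/JS just to show the idea (the secret and the names are invented for this sketch):

```python
import datetime
import hashlib
import hmac

SECRET = b"replace-with-a-real-server-side-secret"  # invented for this sketch

def issue_token(user_id: str, day: datetime.date) -> str:
    # Sign the issue date together with a visitor id; the token is
    # only valid for requests made on the day it was issued.
    msg = (user_id + "|" + day.isoformat()).encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def token_is_valid(user_id: str, token: str, today: datetime.date) -> bool:
    # Recompute today's token and compare in constant time.
    return hmac.compare_digest(token, issue_token(user_id, today))

today = datetime.date(2012, 6, 1)
tok = issue_token("visitor42", today)
print(token_is_valid("visitor42", tok, today))                          # True
print(token_is_valid("visitor42", tok, today + datetime.timedelta(1)))  # False
```

A POST arriving with yesterday's cookie would then simply be rejected, without having to block the bot by IP.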
I'm sure most people/bots would just use another site instead of enabling JS in their bot. What else can I do? I'm thinking of a daily POST limit and a honeypot for generic bots that just randomly post spam"} {"_id": "16324", "title": "Need help for stopping spam inquiries", "text": "> **Possible Duplicate:** > Make your site anti-bot? I am not a web developer, nor do I have the knowledge required to grasp this situation. I have a small company with a modest website where I market properties. Recently, my website has been bombarded with spam inquiries. Here are some examples: \"It was dark when I woke. This is a ray of sunhsnie.\" \"This is exactly what I was looknig for. Thanks for writing!\" \"What a joy to find sooemne else who thinks this way.\" As you can see, all of them have typos, probably for tracking the work done.
People get incentivized by video games and porn sites and who-all to just paste their ads all over the Internet. I feel like just locking it all up and requiring admin approval for every account. On the other hand, I don't even like registering for websites. A lot of times, I just go out on a 'do-gooder' mission, trying to hook somebody up with an answer to an extremely difficult question. I run into 3 or 4 forums I've never been to in my life..., just wanting to post on them cause they're at the top of google. I don't want to submit my information to them, though. I don't know or trust them, and I don't want to go through validation steps. So, the Internet just goes without the valuable information that I have, sometimes. I don't want people to just avoid my website because they aren't in the mood to hassle with sign-ups. Naturally, I've tried Captcha and validation e-mails, but spammers are generally smart enough to get by these. Are there some other good techniques?"} {"_id": "58833", "title": "Do you have to tell the user that they're viewing ads", "text": "I'm creating a feed that will show different subjects with Masonry. These \"boxes\" will be clickable and link to external websites. Now some of the \"boxes\" in Masonry will eventually be ads. I read somewhere that if you're showing ads on your website, you MUST display something like ---------- ---------- ---------- | | | | | | | Box | | Box | | Box | | | | | | | ---------- ---------- ---------- Advert Is this true? Because I would prefer to keep the normal boxes and advert boxes the exact same so the user doesn't know that they're clicking on ads. 
Something like: ---------- ---------- ---------- | | | | | | | Box | | Box | | Box | | | | | | | ---------- ---------- ---------- I'm not trying to con anyone; the adverts will obviously have relevant information, but I'd prefer to keep the user oblivious to the fact that they just clicked on an ad link."} {"_id": "29299", "title": "Is placing a form before a video a best practice or not?", "text": "I have a product site in which I have a page called Demo. To view this demo, I have placed a form to be filled in so that I can contact viewers. The form contains basic questions such as email ID, name, and phone number. So customers can view the video only after filling in the form. Is it good practice to place such a form or not?"} {"_id": "29298", "title": "SEO for multiple product pages", "text": "I have a domain with 16 products, and the URL structure for those pages is like www.domain.com/product1 www.domain.com/product2 www.domain.com/product3... Now my question is how to promote my site using link building, because if I do 40 submissions daily for each product (suppose I do it for 3 products), then the total submissions for the day will be 120. I think this may become over-promotion, because indirectly we are promoting the home page only. Do you think this will penalize my site, or shall I continue? One more question: which format is better for the domain, i.e., www.domain.com/product1 (or) www.product1.com? Please let me know. Thanking you in advance."} {"_id": "51848", "title": "Publicly displayed e-mail address: Is it safe from spam-bots when dynamically generated?", "text": "I am currently in the process of making myself an employment website; the entire site uses a client-side rendering engine to dynamically load views. Because of this, the source of the page returns only the template elements. I have my e-mail address written in plain text in one of the views that is loaded after the initial page load. Is this safe from most spam bots?
Or at the very least, is it something I should worry about? Please do not suggest adding an e-mail form or other techniques; I am not asking for alternatives, I am asking specifically about the security behind this method."} {"_id": "29293", "title": "How to add google doodle?", "text": "I want to know for whom Google adds doodles on its home page. Suppose I want to add a Google doodle for my birthday on the Google homepage; is it possible to do so? If it is possible, then please let me know how. I have Gmail and Google+ accounts."} {"_id": "29747", "title": "What version of IE does Compatibility View use by default?", "text": "My _understanding_ of IE Compatibility Mode/View is that you can specify which version of IE you want to render a domain (or set of domains) in. But when I go to `Tools >> Compatibility View Settings` it only allows me to add a bunch of sites to be displayed in Compatibility View. My question is, what version of IE is this Compatibility View using? If I add `example.com` and `blah.net` to Compatibility View, what version of IE will they be displayed in?"} {"_id": "47239", "title": "I want to redirect subdomain.domain1.com to domain2.com/directory using DNS", "text": "I own 2 domain names, \"domain1.com\" and \"domain2.com\", and I want to redirect subdomain.domain1.com to domain2.com/directory using DNS (preferably). Can anyone assist (I am an amateur at domain management)?"} {"_id": "47238", "title": "Create partial php.ini files", "text": "I want to split php.ini into two files so that I can move important entries to a new file. Since it is a .ini file, any rule for creating partial .ini files should also apply to it. Is it possible to do so?
If yes, how?"} {"_id": "54375", "title": "'Buy the app' landing page implementations: redirect or JavaScript popup?", "text": "My site (using Django) has an app that I'm trying to push - I currently have a piece of middleware that redirects the user to a page advertising the app if they're accessing the page on an iPhone, then sets a cookie so that the user isn't bugged by the message every time they visit the site. This works fine; however, checking the page with the mobile Googlebot checker shows that the Googlebot gets stuck in the redirect (since it doesn't store cookies) and therefore won't index the proper content. So, I'm trying to think of an alternative implementation that won't hurt the site's Google ranking and won't have any other adverse effects. I've considered a couple of options: * **Redirect (the current solution), but don't redirect if the user agent matches the Googlebot's UA string**. This would be ideal; however, I'm not sure if Google likes their bot being treated differently from other users, and I'm afraid the site's ranking may be somehow penalised if I go ahead with this. * **Use a JavaScript popup instead of a redirect**. This would make sure the Googlebot finds the content it needs; however, I envision this approach causing compatibility issues with the myriad mobile devices/browsers out there, and it may affect the page load time. How valid are these options? And is there a better option out there for implementing this feature? I've tried researching this topic but surprisingly can't find any reputable-looking blog posts that explore it."} {"_id": "1158", "title": "Cheap server stress testing", "text": "The IT department of the nonprofit organization I work for recently got a new virtual server running CentOS (with Apache and PHP 5), which is supposed to host our website.
During the process of setting up the server I discovered that the slightest use of the new machine caused major performance problems (I couldn't extract tarballs without bringing it to a halt). After several weeks of casting about in the dark by tech support, it now appears to be working fine, but I'm still nervous about moving the main site there. I have no budget to work with (so no software or services that require money), although due to recent cutbacks I have several older desktops that I could use if it helps. The site doesn't need to withstand massive amounts of traffic (it's a Drupal site with just a few thousand visitors a day), but I would like to put it through a bit of its paces before moving the main site over. What are cheap tools that I can use to get a sense of whether the server can withstand even low levels of traffic? I'm not looking to test the site itself yet, just the fundamental operation of the server."} {"_id": "68736", "title": "Canonical and hreflang implementation for international desktop and mobile site versions", "text": "I have an interesting case and I am lost in it. There are two versions of the site: desktop and mobile. And there are also international versions: English and Spanish. I'm stuck on the implementation of the canonical tags. Currently my setup has the following: English (default) desktop page has these: English Mobile page has these: Spanish Desktop version: Spanish Mobile version: But I somewhat feel that I messed things up... Three questions: **1.** Could you guys point out what I did wrong and explain how to set it right? **2.** Since I'm redirecting from the desktop version of the pages to the mobile pages when mobile users access the desktop pages, do I also need to redirect desktop visitors to the desktop page versions if they visit mobile pages?
**3.** Since my pages already have alternate and hreflang references pointing to the international and mobile versions of the pages, if I add 5 languages of desktop versions and 5 languages of mobile versions to the sitemaps, my sitemaps will get bulky. What are the pros and cons of referencing all page versions in the sitemap versus including just the general (English/desktop) version?"} {"_id": "58427", "title": "How to test hosting before you buy it", "text": "There are several services that give you the ability to test the ping time to a server. But if I don't have any server on the hosting being tested yet, how do I test the ping to it? Can I ping the hosting company's website itself?"} {"_id": "47233", "title": "What is the maximum size of an HTML file that Google will crawl through?", "text": "I have a 10MB HTML file. Will Google crawl through the entire file, or does it only look at the first X MB?"} {"_id": "23072", "title": "404s on password protected content", "text": "I'm new to WordPress and SEO generally, but we've been running into problems with our site that don't seem to make sense to me. The problem is that our editor likes to schedule posts and/or mark them private until she is ready to make them public, but somehow Google is crawling these posts and getting 404s (because they are password protected). How does Google know they exist in the first place? I checked the sitemap.xml file and don't see a record of the posts. One of the offending posts was marked public, but is scheduled for a future date. Could that have something to do with it? I've tried to Google the answer, and I came up with a good amount of reassurance that this won't hurt the site, but I'm still wondering how it's happening in the first place. It's hard because I don't know exactly what the editor's workflow is. Is it possible she's posting publicly first and then revising it to be private only after it's too late?
Does anyone know how Google finds WordPress URLs it shouldn't have access to?"} {"_id": "1156", "title": "Are there open solutions for fraud detection?", "text": "I've been dealing with an inordinate number of bogus payments, mostly people using PayPal. I am aware of services that provide an API that returns a score, but I'd like to first explore the possibility of using something open source. Are there any solutions (language matters not) that use freely available geolocation databases (I'm interested only in country-level accuracy, and it does not have to be perfect) combined with other freely available sources of information that may help determine a fraudulent sale? (i.e., checking IP blacklists, etc.). I could probably write something, but it's not a wheel that I'm particularly interested in studying, much less re-inventing, unless I must."} {"_id": "47237", "title": "Does google take into account page load differences from caching?", "text": "Here's what I mean by that. If I use internal CSS, the page will load a bit quicker the first time it is visited, but with external CSS a user can cache the document and load the rest of the site faster. Does Google take it on a page-by-page basis, ignoring caching, or will speeding up overall load times throughout the site with one large cached CSS document be beneficial? My priority is SEO rather than user experience in this case, because in this instance the user experience difference will be relatively minimal, and I want to crank the SEO as hard as possible. Thanks for the help!"} {"_id": "47936", "title": "Is Opera Version 12.15 using WebKit or Presto for its rendering engine?", "text": "I cannot tell from the Opera Desktop team blog or from the Presto Wikipedia page if Opera 12.15 is utilizing the WebKit or Presto rendering engine.
The phrasing from Wikipedia can be interpreted either way: > It [Presto] remained in use until Opera 12.15, when the browser's developer Opera Software ASA began phasing Presto out of its products in favor of the WebKit layout engine and V8 JavaScript engine combined with a modified Chromium browser. Is Opera Version 12.15 using WebKit or Presto?"} {"_id": "23074", "title": "Is reCaptcha now basically useless?", "text": "Is reCaptcha now basically useless? I have read around the internet that reCaptcha has been broken, and that spam bots easily overcome it. Has Google addressed this at all? Do they have any plans to fix it? I cannot find this type of info on the web, and I was hoping somebody might have some insight. Are there any similar, reliable, more secure webservices out there for this type of thing? I do not want to have to make my own captcha-type class, because on-the-fly image processing is fairly resource intensive, and I do not want to make a Q&A-type captcha (I do not want this class to need DB access). Any suggestions or info about reCaptcha's problems are welcome."} {"_id": "59011", "title": "Similar article by the same Google Author", "text": "I have two websites which focus on two different things (both are about timber, but different types of timber). Let's say I post an article on both sites (the article is about bush fires) which has exactly the same content except for the last sentence; will one of the websites get penalized by Google for having duplicate content? What if both sites use the same Google Author? Will this make any difference?"} {"_id": "49426", "title": "How can Google crawl content accessed via filter check-boxes?", "text": "I built a website that is divided into many categories. Each page displays the main category, and within that there is a check-box filter (like Amazon and eBay have when searching for items).
I want this filter to be indexed by crawlers so that my sub-categories will also appear in the search engine results. How should I build this filter?"} {"_id": "59014", "title": "Traits of successful sites", "text": "I learned the hard way creating websites. One day it dawned on me that we should have social networking on our sites; another day, that a mobile version is required; then, that we should register with Google Analytics and Google Webmaster Tools. Analysis with AWStats is also crucial for visitor stats. Successful sites are most likely on VPS or higher hosting. Choosing the right CMS is also important. Implied is good original content. **Without making further discoveries, I'd like to know from pro webmasters the common traits of all successful sites.** * * * **UPDATE:** I'm not asking about the content or the quality of it. I want to know the common features/functions of successful sites."} {"_id": "15118", "title": "What exactly is a company like SoftLayer?", "text": "I found out HostGator has their servers hosted with SoftLayer. I wasn't aware there was another company involved. Can someone explain to me what exactly the role of a company like SoftLayer is, and what they do vs. what HostGator does?"} {"_id": "30295", "title": "Potential issues with multiple home pages", "text": "I have a site where I want to have **two different home pages** : a general description page for anonymous users, and a dashboard page for logged-in users. I am debating between two implementations: * Both pages live at `/` * The page for anonymous users is located at `/` and the dashboard is at `/dashboard`, with automatic redirection between them based on whether a given user is logged in (e.g., if you're logged in and navigate to `/`, you are redirected to `/dashboard`).
**Is it cleaner to have both pages use the same URL or separate URLs?** Also, I imagine that the choice for that question will affect the following: * Caching: the anonymous page would be completely cached, while the logged-in page would not be cached at all (except for static resources). This could lead to issues with server caching, request speed, and UX (such as if one version of the page is cached in a user's browser when the other version should be displayed instead). * SEO: how would search engines react to such canonical URLs? * Load time (due to redirects or to the server having to always reevaluate which page to display)"} {"_id": "15115", "title": "Open-source, PHP, DB-free SCM for brochure site?", "text": "A friend of mine is an architect starting her own business, and needs to build a basic brochure site to promote herself. The site will simply include a few articles and pictures of projects she worked on. There are so many CMSes that it's hard to choose one, but the following are basic requirements: - Open-source - Mature, and with good support - In PHP, since just about any hoster supports PHP - DB-free, to make deployment really basic. If the SCM really does need a database for indexing, SQLite is OK - Good UI so she can easily add articles and photos to her site without having to know any HTML/CSS - Nice templates to choose from Thank you."} {"_id": "15116", "title": "How do I setup a permanent redirect from one domain to another?", "text": "I have a domain, say www.example.com, which is redirected to a website built in Google Sites. I don't want different search engines to treat example.com as a different site from www.example.com. How do I make a permanent link between the two? My domain is under GoDaddy. **EDIT** In GoDaddy, I have the forwarding option set to the Google Sites link. Under CNAME, I have the Google-provided alias and directions."} {"_id": "15113", "title": "Do users resize text?", "text": "I'm redesigning a website.
For certain content areas the layout is fine at my text size but screws up if I set the text any bigger. I often resize pages with Firefox, but the whole page resizes so the layout still works. So, should I worry about users having larger text but the same CSS otherwise? I don't know how to test for this sort of thing. The site works fine with every browser I've looked at it with. I know some usability devices change layouts, but don't they ignore normal styles altogether?"} {"_id": "58370", "title": "Is redirecting a subdomain to point to a folder good practice?", "text": "I have a blog which can be accessed at http://mysite.com/blog/ I would like to access it like this because I think it looks cleaner: http://blog.mysite.com I did some reading on domains, subdomains, and redirects and have created a subdomain with the desired URL and redirected it to the blog path. Is this a common setup? Is there a better way to accomplish what I want to do?"} {"_id": "29127", "title": "Cookie Audit help needed", "text": "Can anyone recommend any decent cookie audit plugins for Firefox or Chrome? I can see the typical Google cookies on the site, but I am struggling to find out what a couple of the other cookies actually do. What is the best way to find out what a cookie actually does?"} {"_id": "29126", "title": "Connecting blogger blog with subdomain.mysite.com, is it good for www.mysite.com SEO/pagerank?", "text": "A client of mine has a static website www.mysite.com and a blogger blog with the same subject, mysiteblog.blogspot.com, and wants to move the blog inside mysite.com to give her site an SEO boost. Will connecting blogspot with the subdomain blog.mysite.com affect page rank and search results for the main domain?
How do the search engines crawl a domain which is both static HTML (main site) and Blogger structure (subdomain)?"} {"_id": "25899", "title": "How to manage multiple domains on same server", "text": "I have more than one domain (4 to be exact) and only one server host with the same IP. How can I run all my domains from the same server? All websites are WordPress sites."} {"_id": "29123", "title": "Apache on Windows - splitting vHost logs", "text": "I have a Windows Server 2008 running Apache, and it will be hosting several virtual hosts. I'd rather not use the logrotate tool (`|bin/logrotate`), as it seems to create significant extra overhead with all the processes. Is there a simple Windows alternative to get the log entries from a combined log file split into several per-site files? Preferably with custom output directories, but that is optional."} {"_id": "25893", "title": "Huge difference between Facebook Ad Click figures and Apache log requests", "text": "We're running a Facebook ad campaign for our business but there seems to be a huge discrepancy between the number of clicks registered and the number of requests made with \"facebook.com\" in the HTTP referrer. The difference can be anything between 40-80 clicks/requests. I understand why the Google Analytics figures would be off and I understand that the figures shouldn't be exactly the same, but surely if 100 people click the ad then I should be seeing at least 90 requests for the homepage with facebook.com as the referrer? Can anybody provide any insight into why this may be happening?"} {"_id": "58541", "title": "Can I set Cache-Control in the .htaccess file for a specific part of a document (just a section of the )", "text": "I noticed, by using Google PageSpeed Insights, that if I structure my HTML to load the critical, above-the-fold content first, my page will improve performance - and it works (with a current perf. of Mobile 71/100 - PC 87/100).
My question is, with Apache's mod_headers activated, and with no content inside my `` tag, is it possible to set an Expires date for any part of my page (e.g. inside the body)?"} {"_id": "39576", "title": "Payment from Paypal but no order in Magento 1.6.2", "text": "I just found a very strange problem and wondered if anyone has seen the same, or maybe knows a fix for this? We have paperwork for an order payment received through PayPal, but the order doesn't exist when checking the Magento orders admin system. How is this possible? Cheers!"} {"_id": "25896", "title": "Tweet count just shot up", "text": "On our homepage we have a tweet button and counter: http://www.scirra.com This was around 600 until overnight it suddenly doubled to 1,200. It's been continuing to rise at a normal rate since. Has Twitter changed what counts as a tweet for that counter? I've noticed competitors' counts have dropped significantly. We don't buy tweets or followers, and I haven't found any spam tweets about us, nor have we had any significant recent press."} {"_id": "39573", "title": "Allowing users in embargoed/sanctioned countries to register on my site", "text": "I run a website that contains technical content and users gain access to it by purchasing a subscription. I am based in the United States, and I'm wondering what the laws are regarding sanctioned visitors. Do I need to prevent access from certain countries, such as Iran, North Korea, etc. to comply with laws? Actually, this is a two-part question: first, do I need to prevent them from even being able to register, without a subscription? Secondly, if they are able to register, can I legally accept payments from them? Thanks!"} {"_id": "29128", "title": "Managing multiple reverse proxies for one virtual host in apache2", "text": "I have many reverse proxies defined for my `js-host` VirtualHost, like so: **/etc/apache2/sites-available/js-host** ServerName js-host.example.com [...]
ProxyPreserveHost On ProxyPass /serviceA http://192.168.100.50/ ProxyPassReverse /serviceA http://192.168.100.50/ ProxyPass /serviceB http://192.168.100.51/ ProxyPassReverse /serviceB http://192.168.100.51/ [...] ProxyPass /serviceZ http://192.168.100.75/ ProxyPassReverse /serviceZ http://192.168.100.75/ The js-host site is acting as shared config for all of the reverse proxies. _This works_ , but managing the proxies involves edits to the shared config, and an apache2 restart. Is there a way to manage individual proxies with `a2ensite` and `a2dissite` (or a better alternative)? My _main objective_ is to isolate each proxy config as a separate file, and manage it via commands. **First Attempt** I tried making separate files with their own VirtualHost entries for each service: **/etc/apache2/sites-available/js-host-serviceA** ServerName js-host.example.com [...] ProxyPass /serviceA http://192.168.100.50/ ProxyPassReverse /serviceA http://192.168.100.50/ **/etc/apache2/sites-available/js-host-serviceB** ServerName js-host.example.com [...] ProxyPass /serviceB http://192.168.100.51/ ProxyPassReverse /serviceB http://192.168.100.51/ The problem with this is apache2 loads the first VirtualHost for a particular ServerName, and ignores the rest. They aren't \"merged\" somehow as I'd hoped."} {"_id": "60434", "title": "Why does bot traffic have a high (e.g. 100%) bounce rate in GA?", "text": "Since bots may request many pages at a time, why do they so often have a 100% bounce rate? It seems unlikely that they would only request one page, then leave for over 30 minutes, then request another page, and so on. Could use some help understanding their behavior."} {"_id": "31212", "title": "difference between accept and content-type http headers", "text": "So the accept header tells the server the MIME type of the resource the browser is looking for. For example, the server can send plain text, HTML, JSON, etc. OK, that makes sense.
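A concrete (hypothetical) exchange may help keep the two straight: `Accept` travels on the request and states what the client wants back, while `Content-Type` travels with whichever message carries a body and states what that body actually is. The URL and payload below are made up for illustration:

```http
GET /articles/42 HTTP/1.1
Host: example.com
Accept: application/json

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{"id": 42, "title": "Hello"}
```

Note that a POST request can legitimately carry both headers at once: `Content-Type` describing its own request body, and `Accept` describing the desired format of the response.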
But then I look at the content-type header and it looks to be doing the same thing. For example, it tells the server that it wants text or JSON. So then what's the difference?"} {"_id": "31742", "title": "Magento Shopping Cart Rule Based on Grand Total?", "text": "I was wondering if anyone knew how to set up a shopping cart rule to apply only if the grand total of the cart is >= some certain value. I see that Magento allows you to set up rules based on the subtotal, but never on the grand total. Does anyone have an idea? Thanks. By the way, we are using Magento Enterprise edition version 1.10.1."} {"_id": "31215", "title": "Google search results are invalid", "text": "I'm writing a program that lets a user perform a Google search. When the result comes back, all of the links in the search results are links not to other sites but to Google, and if the user clicks on one, the page is fetched not from the other site but from Google. Can anyone explain how to fix this problem? My Google URL consists of this: http://google.com/search?q=gargle But this is what I get back when the user clicks on the Wikipedia search result, which was http://www.google.com/url?q=http://en.wikipedia.org/wiki/Gargling&sa=U&ei=_4vkT5y555Wh6gGBeOzECg&ved=0CBMQejAe&usg=AFQjeNHd1eRV8Xef3LGeH6AvGxt-AF-Yjw Gargling - Wikipedia, the free encyclopedia What's unclear to me is how a regular browser ever receives the wikipedia.org link to be able to put it in its address bar...
The headers aren't too helpful either: \"Cache-Control\" = \"private, max-age=0\"; \"Content-Type\" = \"text/html; charset=UTF-8\"; Date = \"Fri, 22 Jun 2012 15:28:51 GMT\"; Expires = \"-1\"; Server = gws; \"Transfer-Encoding\" = Identity; \"X-Frame-Options\" = SAMEORIGIN; \"X-XSS-Protection\" = \"1; mode=block\"; Status code was 200, not a redirect."} {"_id": "31740", "title": "Do or can robots cause considerable performance issues?", "text": "At work we are in a discussion with team members who seem to think bots will cause us problems relating to performance when running on our services website. Our setup: Let's say I have the site www.mysite.co.uk; this is a shop window to our online services which sit on www.mysiteonline.co.uk. When people search in Google for mysite they see mysiteonline.co.uk as well as mysite.co.uk. Cases against stopping bots crawling: * We don't store GBs of data publicly available on the web * Most friendly bots, if they were to cause issues, would have done so already * In our instance the bots can't crawl the site because it requires a username & password * Stopping bots with robots.txt causes an issue with SEO (ref.1) * If it was a malicious bot, it would ignore robots.txt or meta tags anyway **Ref 1.** If we were to block mysiteonline.co.uk from having robots crawl it, this will affect SEO rankings and make it inconvenient for users who actively search for mysite to find mysiteonline. Which we can prove is the case for a good portion of our users."} {"_id": "8034", "title": "Adult website crawlability issue", "text": "I have a mission and am asking for some help with this scenario for a new website. The site is an adult escorts advertising service. It is developed fine, works and looks good, but the issue is with crawlability. Google Webmaster Tools indexes the same content for every page. Let me explain further. If any client tries to access any page, the website shows a default agreement/warning page.
If the client clicks \"proceed\", a session variable is set and the client is redirected (302) back to the original page. If the client says \"no\", then he is redirected to another website (Google, Wikipedia, news portals...). Well, this is a safe process, but it causes the crawlability issue I described. So, any light on this? How can I make the post-agreement content be correctly crawled and indexed?"} {"_id": "8031", "title": "How to fix every link under new added domain?", "text": "I installed an SMF forum in a subfolder on a hosted server under my domain. Later on, I acquired a new domain, added it to my nameservers and pointed it to that folder through cPanel. The index works well, the link to the forum works well, but deeper-level links (boards, topics, profiles, login, etc.) do not. Not that they don't work per se, but the links are not under the new domain, and thus the user gets logged out when entering the site through the new domain address, and gets back to the old address once logged in, and similar problems. What should I do to fix all of the links? I'm guessing that this is because the forum was installed before the new domain was added, and the links are fixed to the old domain. Is there another solution besides backing up and installing everything from scratch? Thanks."} {"_id": "8032", "title": "How can I buy a .by domain name", "text": "I can't find anywhere on the net to buy them."} {"_id": "64799", "title": "Google not indexing new forum", "text": "We installed a new forum a few months ago now. The URL is: https://www.scirra.com/forum I've 301'd the old topics/threads, as well as included all the new URLs in the sitemap. Yet they still are not appearing. Webmaster Tools is showing: 139,512 URLs submitted 50,544 URLs indexed And it has been stuck there for quite some time.
A massive drop in indexed pages since we updated the forum as well: ![enter image description here](http://i.stack.imgur.com/Q1LuR.png) Any help much appreciated"} {"_id": "60435", "title": "Url returns error code for Adwords, but works fine in a browser", "text": "According to AdWords, my ad has a bad URL that returns an error code, but I can load the URL in a browser without any problems or error codes. I've fetched the URL as Google from Webmaster Tools; no error code either. The actual AdWords report states: > [Destination URL] Invalid HTTP Response Code: To ensure a good user > experience, your destination URL must work properly and not return an error > code beginning with a 4 or a 5 (such as a 404 error). Your destination URL > or some component on that page (like Javascript) is returning a status code > indicating an error. Here's the actual url that is problematic - as you can see it gets loaded fine without any errors. Can you recommend how I should troubleshoot this problem?"} {"_id": "42832", "title": "Best practice for Google Analytics on a mobile site", "text": "I have a new mobile site which sits on `m.`, `m.example.com` What is the best method for integrating its Google Analytics data with the existing data I have? I want them together so I can easily see total visits, for example. But then I would like to segment them again to see how much traffic is going to each domain. Is this possible, and if so how? Thanks"} {"_id": "8529", "title": "Use virtual pageviews for all goal tracking", "text": "I'm new to Google Analytics and I'm wondering if it would be cleaner to use virtual pageviews for all the goal tracking on my website instead of using a mix of regular pageviews and virtual pageviews. I know in most cases this is just semantics, but there are multiple pages where the same goal can be achieved and I think it would be cleaner just to fire the same virtual pageview instead of having two different goal pages.
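A minimal sketch of the virtual-pageview idea described above, using the classic ga.js async command queue. The virtual path `/goals/signup-complete` and the page names in the comments are made up for illustration; the point is simply that several different real pages can push the same virtual path:

```javascript
// Simulate the classic ga.js async command queue (_gaq).
// In a real page, ga.js consumes this queue; here it stays a plain
// array so the sketch is runnable anywhere.
var _gaq = _gaq || [];

// Fire the same virtual pageview from any page that completes the goal.
// "/goals/signup-complete" never has to exist as a real URL; the goal
// definition in GA just matches this virtual path.
function trackGoal(virtualPath) {
  _gaq.push(['_trackPageview', virtualPath]);
}

// Two different real pages completing the same goal:
trackGoal('/goals/signup-complete'); // e.g. from /register/thanks.html
trackGoal('/goals/signup-complete'); // e.g. from /promo/thanks.html

console.log(_gaq.length); // 2
```

With this shape, the goal in GA is configured once against the virtual URL, so adding a third page that completes the goal needs no change to the goal definition.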
Will this model also give developers more flexibility when they do development? I know we are moving to a CMS and URLs can get hairy, so I think this might be a good way to make the analytics portion of the site \"future-proof\". Any thoughts are appreciated! Thanks."} {"_id": "12588", "title": "How to setup mod_rewrite/htaccess to do url masking and forwarding to a subdirectory?", "text": "It's been about three years since I've played with Apache and PHP (I've been using Thin and nginx ;). So I've forgotten how to set up a mod_rewrite directive to forward all HTTP requests from the root to the application's installed folder. Current setup and restrictions: * The site is on a shared cPanel hosting service that doesn't allow applications being installed as root (feature not a bug in my book, for versioning and whatnot). * The CMS application was installed under joomla-1.5.60/ * I need to set up /(.*) to redirect to /joomla-1.5.60/$1 but still look to the browser as /(.*) * RedirectMatch does not work * I cannot establish the docroot as the application's root because of account restrictions I've read over the Apache docs and I keep getting a redirection loop. So any help you guys can provide would be appreciated. Thanks, D."} {"_id": "42833", "title": "Url blocked by robots.txt", "text": "I am badly stuck on an error in Webmaster Tools. I have created a sitemap, and when I try to test it in Webmaster Tools it shows me the error below: Issue : Url blocked by robots.txt Description : Sitemap contains urls which are blocked by robots.txt However, even when I removed the robots.txt file from the server it was still showing me this error. I have no idea how I can get rid of it. Thanks"} {"_id": "50486", "title": "Website blog, within website or separate?", "text": "I run a web development firm and a hosting company and we are launching new blogs for each company.
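For the mod_rewrite question above: a minimal .htaccess sketch, assuming the `/joomla-1.5.60/` install path taken from the post. The internal rewrite (no `[R]` flag) keeps the original URL in the browser, and the condition prevents the redirection loop the poster ran into:

```apache
# Sketch only - assumes the Joomla copy lives in /joomla-1.5.60/ as in the post.
RewriteEngine On
# Skip requests already inside the install folder, otherwise the
# rewritten URL would match the rule again and loop forever.
RewriteCond %{REQUEST_URI} !^/joomla-1\.5\.60/
# Internal forward (no [R] flag), so the browser keeps seeing /(.*).
RewriteRule ^(.*)$ /joomla-1.5.60/$1 [L]
```

The loop usually comes from omitting the `RewriteCond`: after the first rewrite, the new URI re-enters .htaccess processing and matches `^(.*)$` again.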
Should we launch our blogs within our sites (like `domain.com/blog`) or should each be a stand-alone site/blog (like `siteblog.com`) with its own design, as if it were a separate site? Why would you go with either over the other?"} {"_id": "8521", "title": "Do search engines use the id's and classes of HTML elements as clues?", "text": "This is more for interest's sake than anything else, since I assume it will make very little difference, but I was wondering whether there is any evidence that search engines (Google, Yahoo!, Bing, for example) use the class names and id's of HTML elements as clues to the content? Would it make any difference, say, to change `id=\"left_column\"` to `id=\"news_column\"` ?"} {"_id": "8520", "title": "When linking pages, does google give higher value to href=\"http://www.mydomain.com\" than href=\"/\"?", "text": "When linking pages does Google give higher value to: ` ` Vs. ` `? I am wondering because I have heard that (for SEO) it is better to provide the full HTTP address when linking to the site's homepage. As I thought about it, I wondered why that would be. Would Google really assign higher value to ` ` instead of ` `? They both mean the same thing. Do you guys know of or have any resources on this?"} {"_id": "12854", "title": "How do i check broken links and images in my website?", "text": "I have recently moved my website from one server to another server. For my confirmation I would like to check the dead links and the missing images on my website. For example: if my website is http://outsourcingnepal.com I would like to crawl every page on the website and find the missing images and non-working dead links."} {"_id": "8525", "title": "How to open the JavaScript console in different browsers?", "text": "# _Updated on October 7th 2012_ * * * # Chrome: 1. Press `CTRL` + `SHIFT` + `J` to open the \"Console\" tab of the **Developer Tools**. Alternative method: 1.
Press either `CTRL` + `SHIFT` + `I` or `F12` to open the **Developer Tools**. 2. Press `ESC` (or click on \"Show console\" in the bottom right corner) to slide the console up. _Note: In Chrome's dev tools, there is a \"Console\" tab. However, a smaller \"slide-up\" console can be opened while any of the other tabs is active._ * * * # Safari: 1. Press `CTRL` + `ALT` + `I` to open the **Web Inspector**. 2. See Chrome's step 2. (Chrome and Safari have pretty much identical dev tools.) _Note: Step 1 only works if the \"Show Develop menu in menu bar\" check box in the Advanced tab of the Preferences menu is checked!_ * * * # IE9: 1. Press `F12` to open the developer tools. 2. Click the \"Console\" tab. * * * # Firefox: 1. Press `CTRL` + `SHIFT` + `K` to open the **Web console** (`COMMAND` + `SHIFT` + `K` on Macs). or, if Firebug is installed (recommended): 1. Press `F12` to open **Firebug**. 2. Click on the \"Console\" tab. * * * # Opera: 1. Press `CTRL` + `SHIFT` + `I` to open **Dragonfly**. 2. Click on the \"Console\" tab."} {"_id": "12584", "title": "Should I bother with a Business.com Listing?", "text": "I keep getting promotion codes from Business.com stating that I can get a listing at a $100 discount. However, that's still $199 and I'm not clear about the benefit. I would understand if I was paying for somewhere I could put a detailed profile (similar to what I can do with LinkedIn, CrunchBase or even Facebook for free), but it's just a text link. I can't even find a single suitable category to use, so I doubt potential customers would be able to find the link either. I would need to pay for multiple listings to get into all the categories that may be useful. However, I could do that with their PPC program, and given the estimated clicks they tell me I will get, it appears to be more economical. In reality I believe that most people wouldn't even bother browsing through a directory hierarchy to find a suitable business these days. They would just use Google...
and that seems to be the crux of their marketing message... The emails they send me have quotes from Aaron Wall saying that he is \"a BIG buyer of [business.com] directory listings\" and that \"It is part of [his agency's] SEO process for the sites [they] care about most\". It appears to me that I wouldn't really be buying a listing that was useful in itself; I would actually be purchasing a paid backlink from a reputable site in the theory that it would pass reputation on to my site in Google's eyes... Does anyone have any **recent** experience with Business.com listings and know anything about their current value (SEO or otherwise)? I thought that relevancy was more important for linking these days and the PageRank of the source page wasn't as important anymore. Also, isn't 'paying for a backlink' just 'paying for a backlink', regardless of how it's sold? If the value of a Business.com listing just relates to reputation that theoretically anyone can purchase, then why should Google place much value on that reputation?"} {"_id": "26602", "title": "Can I host my PHP app in such a way that users can verify the source code being used by the server?", "text": "I have worked with one other person to create a mobile-optimized web proxy to the League of Legends Tribunal (a peer review system). Riot Games, the creators of LoL, made a really nice web interface that happens to work very poorly on mobile browsers, so I set out to fix it myself. Due to constraints of the \"API\" available to us (basic reverse engineering of the REST traffic in the original), **our app unfortunately requires users to enter their username and password in order to use it**. Having users share their passwords with third-party apps is unquestionably bad practice, but that's not a solvable problem at the moment. As a first step to mitigate the problem, we have open-sourced our app on GitHub for those that have the know-how to host it for themselves after verifying the code is safe.
However, there are plenty of people who could benefit from our app but can't be expected to host their own trusted version of it. This brings me to my question: **Is there a way that I can host this app so that someone could verify that it uses the same trustworthy code found in our GitHub repo?** I'm not concerned with building a solution that would automatically compare it to the latest code on GitHub or anything. Simply allowing them to browse the code being used would be enough, but I can't think of a way that would be trustworthy and couldn't be spoofed by a crafty mind. Any ideas?"} {"_id": "63499", "title": "Is 100-150 words content enough for a web page?", "text": "I have some 20 different combinations of canonical pages featuring images in different categories. Is 100-150 words of content okay for a page, considering I use rel=canonical on their copies? I mean, how's Google going to take this? Does this affect the rankings? What are the other options?"} {"_id": "37794", "title": "Rewrite catalog/index.php to www.domain.co.uk", "text": "We have an online shop with a welcome page. Our SEO company has asked us to get rid of the welcome page with a rewrite. I am struggling to get this to work. Can you help please? I want to rewrite `www.domain.co.uk/catalog/index.php` to become `www.domain.co.uk` using a 301 redirect."} {"_id": "26607", "title": "Can I Use A Canonical Tag Instead of a Redirect for Updated Content?", "text": "I have some old articles on my blog that get quite a bit of traffic, but are very outdated. I want to remove them from Google's index using the `noindex` tag, but I'm not sure what the best approach will be to send the same traffic to my new article on the subject without using a redirect (as I want to keep them in my blog archives).
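For the catalog/index.php rewrite question above: a hedged .htaccess sketch, assuming the shop's root is itself served by `/catalog/index.php`. Only requests where the client literally typed that path get the external 301, so any internal rewrite of `/` to the same script does not re-trigger the rule and loop:

```apache
# Sketch - assumes the root URL internally serves /catalog/index.php.
RewriteEngine On
# %{THE_REQUEST} holds the raw request line (e.g. "GET /catalog/index.php HTTP/1.1"),
# so this condition only matches when the browser asked for the path directly.
RewriteCond %{THE_REQUEST} \s/catalog/index\.php[\s?]
RewriteRule ^catalog/index\.php$ / [R=301,L]
```

Testing `%{THE_REQUEST}` rather than `%{REQUEST_URI}` is the usual trick here, because `%{REQUEST_URI}` changes as rewrites are applied while the request line does not.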
I was intending to just put a link at the top of the article pointing to the new one, but was wondering if it was appropriate to use a `canonical` tag instead; the new article is on the same subject but doesn't contain the same content, so isn't really a copy."} {"_id": "50474", "title": "How to prevent Alexa and Google from indexing our stuff?", "text": "My traffic is not from search engines and I want to prevent bots or others from displaying or accessing it. What would be a good _.htaccess_ code to do so? I've heard User-agent: ia_archiver Disallow: / doesn't really remove the Alexa ranking, because Alexa gets the content from its toolbar. Okay. What works then?"} {"_id": "52728", "title": "Should I expect Search Engines (particularly Google) to crawl epub and mobi files", "text": "Some of our content we are publishing as epub and mobi eBook format files. Will search engines crawl and index these in the same way as PDFs currently are? If the answer is yes, could you point me to a sample results page that shows how they appear."} {"_id": "54064", "title": "Adding more than one Google Analytics tracking code to a site - is this ok?", "text": "We have several sites that would like to aggregate all the analytics details, along with the individual site details. AFAIK, there isn't a way to do this from the GA interface. Is it alright to add another GA code to each site to allow us to track all details from one GA property?"} {"_id": "35918", "title": "How to get search engines to properly index an ajax driven search page", "text": "I have an ajax-driven search page that will allow users to search through a large collection of records. Each search result points to index.php?id=xyz (where xyz is the id of the record). The initial view does not have any records listed, and there is no interface that allows you to browse through all records. You can only conduct a search. How do I build the page so that spiders can crawl each record?
Or is there another way (outside of this specific search page) that will allow me to point spiders to a list of all records? FYI, the collection is rather large, so dumping links to every record in a single request is not a workable solution. Outputting the records must be done in multiple requests. Each record can be viewed via a single page (e.g. \"record.php?id=xyz\"). I would like all the records indexed without anything indexed from the sitemap that shows where the records exist, for example: Record 1 Record 2 Record 3 next Assuming this is the correct approach, I have these questions: 1. How would the search engines find the crawl page? 2. Is it possible to prevent the search engines from indexing the words \"Record 1\", etc. and \"next\"? Can I output only the links? Or maybe something like:"} {"_id": "43956", "title": "SEO: Monthly changing homepage, better to redirect or duplicate?", "text": "I have a website called What can I plant now, which is a simple site with a page for each month of the year and a homepage which mirrors the content for the current month. **Example** `http://whatcaniplantnow.com` shows the content for the current month, which is currently exactly the same as `http://whatcaniplantnow.com/february` This will change from month to month. Obviously most people will - as the site's title alludes - be looking to find out what they can plant at that time, hence wanting to answer that question on the homepage - it's almost a one-page site. So, from an SEO perspective, would I be best off with my current approach, where the homepage content changes once a month and always duplicates another page, or effectively not having a homepage and instead having `http://whatcaniplantnow.com` redirect to `http://whatcaniplantnow.com/february` or whatever the current month is? Or is there a better option?
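If the redirect route were chosen, the only moving part is mapping the current date to the month page. A runnable sketch of just that mapping (plain JavaScript for illustration; the domain is taken from the question, and a real server would issue a 301/302 to the computed URL instead of duplicating the content):

```javascript
// Map a date to the month page the homepage should point at, e.g. "/february".
var MONTHS = ['january', 'february', 'march', 'april', 'may', 'june',
              'july', 'august', 'september', 'october', 'november', 'december'];

function currentMonthPath(date) {
  date = date || new Date(); // default to "now"
  return '/' + MONTHS[date.getMonth()]; // getMonth() is 0-based
}

// For 14 February 2013 (month index 1):
console.log('http://whatcaniplantnow.com' + currentMonthPath(new Date(2013, 1, 14)));
// http://whatcaniplantnow.com/february
```

A 302 (rather than 301) would arguably fit better here, since the target changes every month and shouldn't be cached permanently by clients.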
**What is the best way of presenting the homepage content and achieving good search engine rankings?** Cheers, Pete"} {"_id": "50323", "title": "sku code as description in Google Analytics", "text": "In the Google Analytics ecommerce tracking script you must provide an SKU code for every item. I have this code for every product I'm selling, and up until now I have always provided it in the `_addItem` method. But when reviewing that data in the ecommerce module of Google Analytics, I have no real, readable data about my SKU sales. I know what product has been sold, due to the product name I provide. But when clicking through to the SKU level, I know nothing more, since all I can see there are SKU codes. Is it possible and wise to replace the SKU code with the following template? \"product-name colour-name size-name\" This way, it should still be a unique field, but more readable afterwards."} {"_id": "57908", "title": "Old and new sites running at the same time", "text": "I have a possible SEO issue at hand and I would really appreciate your advice. My client has an online store (`www.example.co.uk`), which has been around for 15 years - it's never seen any design changes, and it's never been specifically optimized for performance or search engines. Frankly, it's a bad site, but it still attracts a good number of customers, due to the niche products it sells. Here comes the issue - I've created a new online store (based on the OpenCart platform), with a contemporary look and features, and SEO optimization (to the best of my belief). Unfortunately, the new site's content is really identical to the old site's - it's got the same domain name with a different suffix - `www.example.eu` - the same items, and only slightly different info pages and product descriptions.
My client is really sentimental about the old website and specifically wants the two sites - the old and the new - running together until the new site starts getting some traffic, at which point we can completely redirect the old to the new site. Three days ago, we put a banner (with a hyperlink to the new site built in) at the bottom of the old website. On average, there would be 3-5 orders per day at this time of the year, but in the past 3 days, there've been none. It may really be a coincidence, but I'm really, really worried there could be something wrong with the SEO. Before Christmas, while the new site was simply \"out there\", it attracted its first order, coming from Google, without specific effort on our side. Now, the old site hasn't seen an order since we put up a link to the new site. Again, the old site's main asset is its age, because from an SEO point of view, it's never been optimized, whereas the new one should be now. I'm aware the optimal scenario is to redirect pages from the old to the new site, but it's not what the client would accept. For what it's worth, looking at the Google Analytics for the old site, this week it appears to attract the same number of daily visitors as last week (around 50 / day). Are we in SEO trouble, and how can it be avoided? Is it just a coincidence no new orders have come through? Would it be best to take off the banner linking to the new one immediately? * * * The situation has come to the point where the client most definitely wants the old site running with a banner linking to the new site - as hard as I tried to convince them otherwise, this is their team's final decision. Having said that, would there be an appropriate way to minimize any SEO damage (or can it be avoided)? Would there be some combination of de-indexing the new site from Google Webmaster Tools, blocking engines with a robots.txt and adding certain on-page tags that would work?
I'm okay with the new site NOT ranking well (if at all), as long as the old site maintains its current performance."} {"_id": "49306", "title": "Why did the Googlebot add \"sp=1\" to my URLs?", "text": "Googlebot crawled my site for the first time today, and added something strange. It appended `&sp=1` to my URLs. It seems like it's trying to gain authorized access? Normally I have `http://domain.com/?lang=en` and `http://domain.com/en` What does this URL mean: `http://domain.com/?lang=en&sp=1`, and why is `&sp=1` added to it? The results with `&sp=1` are the same as the root page."} {"_id": "47387", "title": "Copyright infringement on an image from an article of mine", "text": "I just received a notification from my hosting provider which told me that a certain company sent a Digital Millennium Copyright Act (\"DMCA\") copyright infringement notice regarding an image found in an article from my site. The image is a print ad and it was meant to offer them exposure. The article itself is a collection of print ads. The hosting provider removed the image from the server and I removed the link from the article. Do I have to worry about other legal issues if the image has been removed?"} {"_id": "61781", "title": "SEO penalty for 301 redirecting googlebot only?", "text": "I have a website that is splitting into two. For the sake of easily describing the what and why, let's say I run a classifieds site and I want to move all the adult content to its own domain. (Very far from the actual situation.) What I think is best for my users is if I do the following: For the first 60 days: * All of the adult sub-URLs load and show a \"Moved to `adultclassifieds.com`, this URL will be disabled in 60 days\" message with a link to the new location and no other content. Throw in a `rel=\"canonical\"` in case Googlebot finds the page anyway. * When Googlebot loads an adult sub-URL, it gets a 301 redirect to the new site. 
After 60 days: * All traffic gets a 301 to the new site Has anyone tried this, or heard of this? I want to make the transition as smooth as possible for the users, but I also don't want to mess up the SEO rankings."} {"_id": "43487", "title": "How many pageviews can my server handle per day?", "text": "I'm sharing some space on a dedicated server, but I'm not sure of the exact specifications. It is very fast though, and the person who I'm sharing the server with says they spend about 1400 a month on it and that they are the same servers used by the defense department. Their site is a small business site that maybe has 150 page views per day. My site has none at the moment but I was wondering how many it could take. My site is simple, and runs 1 to 10 queries per page depending on the page. It has almost no images, except for the logo. What's an estimate of the number of page views I could expect to get per day at full server load? Thank you. Sorry for being vague, but that's really all the information I know."} {"_id": "61039", "title": "How do you get your site discovered by users in China?", "text": "I have a website in English and no translation available on the site. I would like to get the site indexed by Chinese search engines. There are ways in which you can submit your site to Chinese search engines. My question is: if my site is in English, even though it gets submitted, will the crawlers of Chinese search engines translate it into their native language? 
If users in China search for some terms in Chinese, will my site ever show up?"} {"_id": "65069", "title": "Changing the Preferred Domain in Google Webmaster Tools resulted in zero stats", "text": "I set my Preferred Domain for my website with `www` in Google Webmaster Tools, then I added the following code to my _.htaccess_ file: RewriteEngine On RewriteCond %{HTTP_HOST} ^example.com$ RewriteRule (.*) http://www.example.com/$1 [R=301,L] Then I added the `non-www` version of my website to Google Webmaster Tools, but all the charts went down to zero in that version. However, changing back to the `www` version, I saw a small decrease in impressions, etc... My question is whether that's normal for my website. If so, then what do I have to do with the `non-www` version in my Google Webmaster Tools account that doesn't have any indexed pages, sitemaps, etc...?"} {"_id": "43952", "title": "How DDNS can link to 3G simcard dynamic IP", "text": "I have a CCTV camera that works via 3G (using a SIM card). I can access it directly using its dynamic IP, but as the IP keeps changing, I won't be able to keep accessing it. My question is: if I set DDNS on the CCTV, how can the DDNS detect the current dynamic IP? Where do I port forward the DDNS host name, as there is no router in use? The scenario is like this: CCTV (3G simcard) ------> 3G provider ------> user <------ <------"} {"_id": "61789", "title": "What do you call the dots that show the number of images on the home page of a site and let you switch between them?", "text": "I've seen this before on a lot of websites and I really like it. I would really like to put it on my website; what is this slideshow-looking thing called and how do you do it? Circled in red. ![enter image description here](http://i.stack.imgur.com/k0Yr6.jpg)"} {"_id": "29322", "title": "Do alternatives to Google AdSense bring better revenues all other things being equal?", "text": "Assuming the same number of ads on a page and the same traffic, do alternatives to Google AdSense bring more (or less) revenue? 
Does anyone have experience to share?"} {"_id": "26807", "title": "Help! Google analytics removed and don't know why", "text": "> **Possible Duplicate:** > My site disappeared from Google search, how long does it take to get back? I got this message: Google Analytics web property: link has been removed from http://aglassmenagerie.net etc. etc. I don't know why or what I should do to fix this. I have been trying to add stuff to my site on a regular basis. How do I find out what the problem is and how to fix it? I am a stained glass artist and have an online store. I am not that knowledgeable about this stuff. Someone please look at my site or code and tell me what I did wrong. Pam"} {"_id": "61032", "title": "Redirecting visitors to a rel=alternate page - permanent or temporary?", "text": "We are rebuilding our travel site to have both en-US and en versions, storing the language/region code in the url as a subfolder. Eg: www.domain.com/africa/botswana (hreflang=\"x-default\") www.domain.com/en/africa/botswana (hreflang=\"en\") www.domain.com/en-US/africa/botswana (hreflang=\"en-US\") All pages will be available in all three versions. A visitor landing on the x-default url will have their origin detected via geoip and be redirected to one of the language/region-specific urls. We plan on offering a site selector (flag dropdown) on each page to let a visitor override the detected origin - this will drop a cookie to remember the preference. When a visitor does land on a non-specific url and is redirected, should this be a permanent (301) or temporary (302) redirect?"} {"_id": "60499", "title": "Physical memory usage issues with WordPress plugins", "text": "Occasionally, my website that's hosted with GoDaddy is crashing due to physical memory overload. 
My host told me that PHP processes are overloading the server, and to try to find out which WordPress plugin is responsible for this, but when I run `top`, I only get this: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND 9204 thisusr 38 18 287m 68m 40m S 0.0 0.2 0:43.58 php 11175 thisusr 38 18 286m 66m 40m S 0.0 0.2 0:37.86 php 13536 thisusr 38 18 279m 60m 40m S 0.0 0.2 0:21.61 php 14091 thisusr 38 18 284m 66m 40m S 0.0 0.2 0:18.29 php 14285 thisusr 20 0 136m 1456 668 S 0.0 0.0 0:00.02 pure-ftpd 15101 thisusr 20 0 135m 1500 704 S 0.0 0.0 0:00.04 pure-ftpd 17461 thisusr 20 0 98956 15m 2712 S 0.0 0.0 0:00.00 cpsrvd-ssl 17466 thisusr 20 0 99092 15m 2712 S 0.0 0.0 0:00.00 cpsrvd-ssl 17745 thisusr 20 0 14908 1132 944 R 0.0 0.0 0:00.00 top 18979 thisusr 20 0 104m 1984 1004 S 0.0 0.0 0:00.35 sshd 18983 thisusr 20 0 11508 1332 1100 S 0.0 0.0 0:00.04 bash How do I find which WordPress plugin maps to each process with \"php\" as the command?"} {"_id": "22388", "title": "How can I secretly ban someone (hellban) in phpbb?", "text": "The trolls are driving me crazy; I'd like to experiment with a secret ban. > A **hellbanned** user is invisible to all other users, but crucially, not > himself. From their perspective, they are participating normally in the > community but _nobody ever responds to them_. They can no longer disrupt the > community because they are effectively a ghost. It's a clever way of > enforcing the \"don't feed the troll\" rule in the community. When nothing > they post ever gets a response, a hellbanned user is likely to get bored or > frustrated and leave. Source: http://www.codinghorror.com/blog/2011/06/suspension-ban-or-hellban.html It looks like there's no way to do this out-of-the-box with phpbb. 
Is there a way to hack it in?"} {"_id": "43950", "title": "How can you tell how much BANDWIDTH RSS Feeds are taking up, overall?", "text": "I am considering pulling out **RSS FEEDS** from the default Outlook Mailbox configuration pushed to our users by Exchange. My two biggest reasons for doing this are that (1) I firmly believe that the majority of my users have NO CLUE what it is or how to use it and (2) those that DO know what it is may be having a negative impact on **BANDWIDTH** by the use of the RSS Feeds. I would like to know if there is a _SIMPLE_ way to tell how much bandwidth the RSS Feeds (combined) are using? I know that it comes in over port 80, same as web traffic, which means that I could put a sniffer on the port then run the traffic through an analyzer...but note that I said SIMPLE. Thanks~~"} {"_id": "52839", "title": "Should I start with less competitive keywords as a strategy?", "text": "The keywords I have been using are very competitive and this is resulting in terrible rankings. Would it be an acceptable strategy to start with some less competitive keywords so that my website gets better rankings and more attention, which will support more natural links to it? Would it then be acceptable to adjust the keywords gradually in the direction of the ultimate keywords, and would this be a doable task?"} {"_id": "52836", "title": "Google Analytics Site Speed - iOS not being sampled", "text": "I'm looking at a breakdown of site speed by operating system in Google Analytics to get a sense of mobile vs desktop load times. The problem is that it looks like GA doesn't really collect speed sample data from iOS? Is this a known limitation with GA or something unique to this site? See image: ![enter image description here](http://i.stack.imgur.com/LV4wz.jpg)"} {"_id": "43483", "title": "How to know in apache logs if the application has crashed?", "text": "I need some help finding a log pattern in the Apache logs related to an application crash. 
Is there a specific keyword attached to this event, e.g. 'crash'? Thanks."} {"_id": "52833", "title": "Can I use imgur as a CDN?", "text": "I have an e-commerce site and I want to use imgur.com as a CDN for my product images by setting the image source of my images. I did not see anything in the ToS specifically about this. Can anyone offer any insight on this topic?"} {"_id": "52832", "title": "Are the websites hosted on Github restricted to programming?", "text": "I was searching for information about static hosting and I found blog posts from people hosting their blogs on GitHub. The hosting feature of GitHub seems pretty good; what I want to know is what types of websites we can host on GitHub Pages. Should all the hosted websites be related somehow to programming (open source) or be personal GitHub user pages (CV, blog, repositories)? I'm aware that we can only host static websites and that the repository will be public. I searched the terms of service and searched on the GitHub Pages page and I can't find an answer."} {"_id": "60928", "title": "Hijacked meta description", "text": "On my site, `faithhopeandfiction`, someone hacked into my website and now it's displaying the wrong meta description in Google search results. I've verified using \"view source\" that the meta description and keywords are correct, but whenever I search for \"faithhopeandfiction\" on Google or Bing it shows the wrong description. What gives? EDIT: I know it's been compromised because someone put JavaScript code in my index page which redirected a visitor to a shoe store whenever someone hit my site (I've already removed that script). However, the description in Google still shows up incorrectly and it's been like this for the past month."} {"_id": "68527", "title": "Postfix cannot send, getting errors", "text": "After installing Postfix, there seems to be a problem when I try to send mail through WordPress. After looking at the logs, this is what I found. 
Aug 27 12:00:01 tehnika postfix/cleanup[2797]: warning: database /etc/postfix/virtual.db is older than source file /etc/postfix/virtual Aug 27 12:00:01 tehnika postfix/pickup[2686]: 5E9BA81BF4: uid=109 from= Aug 27 12:00:01 tehnika postfix/trivial-rewrite[2798]: warning: database /etc/postfix/virtual.db is older than source file /etc/postfix/virtual Aug 27 12:00:01 tehnika postfix/cleanup[2797]: 5E9BA81BF4: message-id=<20140827160001.5E9BA81BF4@tehnika.mk> Aug 27 12:00:01 tehnika postfix/qmgr[1994]: 5E9BA81BF4: from=, size=682, nrcpt=1 (queue active) Aug 27 12:00:02 tehnika postfix/smtp[2799]: 5E9BA81BF4: to=, orig_to=, relay=none, delay=0.74, delays=0.01/0/0.72/0, dsn=5.4.4, status=bounced (Host or domain name not found. Name service error for name=mail.tehnika.mk type=AAAA: Host not found) Aug 27 12:00:02 tehnika postfix/cleanup[2797]: 1E3D481BF5: message-id=<20140827160002.1E3D481BF5@tehnika.mk> Aug 27 12:00:02 tehnika postfix/qmgr[1994]: 1E3D481BF5: from=<>, size=2582, nrcpt=1 (queue active) Aug 27 12:00:02 tehnika postfix/bounce[2800]: 5E9BA81BF4: sender non-delivery notification: 1E3D481BF5 Aug 27 12:00:02 tehnika postfix/qmgr[1994]: 5E9BA81BF4: removed Aug 27 12:00:02 tehnika postfix/smtp[2799]: 1E3D481BF5: to=, relay=none, delay=0.33, delays=0/0/0.33/0, dsn=5.4.4, status=bounced (Host or domain name not found. Name service error for name=mail.tehnika.mk type=AAAA: Host not found) Aug 27 12:00:02 tehnika postfix/qmgr[1994]: 1E3D481BF5: removed"} {"_id": "13988", "title": "Is unique IP address a must for SSL?", "text": "The shared hosting company told me that if I buy an IP address then it will be applicable for the addon domains too. What does that mean? Does it mean the SSL certificate won't work, since there will be more than 2 domains on one IP (if I add an addon domain)? If an SSL certificate works with one IP where two different domains are hosted, then why does it not work on a shared host?"} {"_id": "60589", "title": "Domain bidding. 
The seller of the domain has failed to respond", "text": "I was trying to place a bid on a domain through sedo.com and got this response: > We regret to inform you that the seller of the domain has failed to respond > to your bid in spite of our repeated reminders. > > If the seller does not respond within 14 days, this bid thread will be > considered inactive and no additional reminder emails will be sent. > > To prevent this, you can extend the negotiation period by making a higher > counter-offer What would you do in such a case? The domain status says: Status: clientDeleteProhibited Status: clientRenewProhibited Status: clientTransferProhibited Status: clientUpdateProhibited Updated Date: 30-aug-2013 Creation Date: 15-aug-2002 Expiration Date: 15-aug-2014"} {"_id": "4295", "title": "Free Web Hosting", "text": "I want to launch (publish) my site on the internet. Can someone tell me about free hosting?"} {"_id": "50694", "title": "Multiple level subdomain's effect on SEO?", "text": "For example, if I'm selling blankets: `incredibly.soft.blankets.mydomain.com` How would that affect my SEO? Would Google penalize me for this?"} {"_id": "50697", "title": "Why Google does not fetch some $_GET parameters in URL?", "text": "I have StackOverflow-style URLs on my site, like `mysite.com/article/42/the-article`. However, when I search for the article, I see that Google has fetched it like this: `mysite.com/article/42/`. Am I doing something wrong here? Should I tell Google to fetch everything? ## Edit: I forgot to add that these URLs are converted to PHP $_GET URLs in the .htaccess file. So the URLs actually point to this: index.php?what=content&id=42&title=the-article And the reason I'm surprised is that the links are on the homepage and all of them use the form with titles. 
So there are **no** links in this form: mysite.com/article/42 All links are in this form: mysite.com/article/42/the-article"} {"_id": "4296", "title": "What are examples of high-volume websites with fixed, proportional, fluid, and hybrid layouts?", "text": "For example, Amazon.com I believe would be an example of a hybrid-fluid-fixed. I really have no idea what an example of a purely proportional layout would be. * * * **Types of Webpage Layouts** are, per Wikipedia's Web Design page: * **Fixed layout:** Pixel measure results in fixed or static content * **Proportional layout:** Em measure results in proportional content that is relative to font-size * **Fluid layout:** Percent measure results in fluid content that shrinks and grows to \"fit\" display windows * **Hybrid layout:** Incorporates any combination of fixed, proportional or fluid elements within (or pointing to) a single page."} {"_id": "60924", "title": "The previous owner of my domain hosted pirated content, what is the best course of action?", "text": "Judging by my error logs, the Google crawler errors, and the massive number of DMCA takedown notices I get, it has become apparent that the previous owner of my domain name (`ianspence.com`) used to host pirated content, specifically software. A quick peek in my error logs shows hundreds of URLs like: * `/graphic/downlaod/full-version-foxit-reader-3141125-free-download-keygen-crack.html` * `/security/downlaod/full-version-fix-it-utilities-professional-10334-free-download-keygen-crack.html` * `/mobile/downlaod/full-version-tunebite-7120101-free-download-keygen-crack.html` * `/antivirus/downlaod/full-version-avg-anti-virus-plus-firewall-internet-security-90730-free-download-keygen-crack.html` Looking at Google Webmaster Tools shows me that Google is trying to crawl these links as well, always returning 404. Last year, right after I purchased the domain, I received a flurry of DMCA takedown notices for links like the ones above that were indexed on Google. 
I never did hear back from Google after contesting every single takedown, but they did stop. My question is, what is the best course of action? Are these old URLs hurting my site's reputation? How can I stop them?"} {"_id": "4290", "title": "Apache 2 virtual host configuration for subdomain redirection", "text": "I'm trying to set up a `mail` subdomain on my site that will simply redirect to my Google Apps mail account. I thought I could use the Apache 2 vhost configuration below, but it's not working: ServerName mail.foo.com Redirect 301 / https://mail.google.com/a/foo.com/ What else do I need to get this to work?"} {"_id": "50692", "title": "Former designer registered my domain in their name and refuses to transfer it", "text": "When I hired a local company to design my website they charged me a low one-time fee and a monthly maintenance fee. It sounded like a great deal at the time, but now I'm not happy with it and I want to go with another company. The problem is the first company registered my domain in their name and they refuse to transfer or sell it. My company is BUILT on this site. I feel it is mine. They have me over a barrel and say they don't have to release me from my contract? What can I do? I'm one of around 50 companies in town that are having this issue."} {"_id": "28656", "title": "What process do you use to determine the \"why\" of web searchers' goal for your \"transactional keywords\"?", "text": "SEO starts with the intentions of the person using the search engine. Whatever the web searcher intended to gain from typing their search keyword/phrase into Google will help you understand what needs to appear in their search results to entice them to click on your money-generating link (a link that generates a sale, which was found using a transactional keyword). The web searcher's intent is \"why\" they performed the search in the first place. 
Ambiguities between why a web searcher searched and whether a particular transactional keyword causes them to click the link when it is not related to their intent result in wasted pay-per-clicks; this is why knowing the intention (the why) of the web searcher's search is important. (Am I confusing AdWords with SEO here?) Additionally, the text displayed in an individual search result (the page title and the slug text) can match the web searcher's intention and cause them to click the link, resulting in a sale, which is what we as webmasters want. So as webmasters, I ask you, what do you do to determine why people are searching for your transactional keywords?"} {"_id": "28657", "title": "Django or PHP? Which is my Future?", "text": "I started to learn Python. Which do you recommend for my future? Should I learn Django or PHP to build sites? Is there a review comparing Django and PHP?"} {"_id": "23383", "title": "Hiring security auditors...what should I know?", "text": "I want to hire someone to do a security audit of my website but I'm not sure how to go about it. Where are good places to look for an auditor? Besides a list of referrals, what should I be looking for in an auditor? What qualifications should he/they have, and how can I verify them?"} {"_id": "57546", "title": "Redirect to the correct URL without trailing slashes", "text": "My correct URL is `http://mydomine.com/cars/register.php` (without a slash at the end). But I send emails to our customers with the wrong URL `http://mydomine.com/cars/register.php/` (with a slash at the end). Customers cannot access the page correctly. How do I redirect them to the correct URL when they click on the wrong link?"} {"_id": "45267", "title": "Removing trailing slash for sub directory url indexing", "text": "I have a site with multiple different sites in sub-directories. I want to view the sub-directory in two formats. For example: 1. `www.example.com/blog` 2. 
`www.example.com/blog/` Both should show the index page under the blog folder. After turning DirectorySlash off, redirection from `www.example.com/blog` to `www.example.com/blog/` is off, but at `www.example.com/blog`, all the contents of that folder are shown instead of the index file."} {"_id": "23385", "title": "How effective is guest-blogging for SEO?", "text": "I was wondering how much time to devote to guest-blogging. It is pretty time-consuming and one person can only do a few per day max. And often you just get a link at the bottom of the post. Is it really beneficial to SEO? And if so, are there any particular tricks or pitfalls to it?"} {"_id": "29944", "title": "How to remove old robots.txt from google as old file block the whole site", "text": "I have a website which still shows the old robots.txt in Google Webmaster Tools. User-agent: * Disallow: / Which is blocking Googlebot. I have removed the old file, updated the new robots.txt file with almost full access, and uploaded it yesterday, but it is still showing me the old version of robots.txt. The latest updated copy's contents are below: User-agent: * Disallow: /flipbook/ Disallow: /SliderImage/ Disallow: /UserControls/ Disallow: /Scripts/ Disallow: /PDF/ Disallow: /dropdown/ I submitted a request to remove this file using Google Webmaster Tools but my request was denied. I would appreciate it if someone could tell me how I can clear it from the Google cache and make Google read the latest version of the robots.txt file."} {"_id": "23386", "title": "What are some software products and tools to use in order to increase user-conversion?", "text": "What are some ways other than in-person tests that webmasters can do to try to increase user conversion and participation?"} {"_id": "50930", "title": "Title placement for good SEO?", "text": "Where should the `<title>` tag be placed for best SEO in relation to the `<meta name=description content=\"\" />` tag? 
Does placing it above the `<meta name=description content=\"\" />` tag or below it matter, or does it not have any relevance at all?"} {"_id": "24765", "title": "Meta field for Domain Url? Or is it possible to change the index of google", "text": "So I made the mistake of using a temporary URL a while ago when launching my web site and called it http://web.mysite.com Now when Google indexes it, even though web. is not the primary URL anymore, it still uses that over http://www.mysite.com Is there any way I can change this? Unfortunately I can't remove the web.mysite.com binding from IIS since all Google links refer to that and I cannot use wildcard binding on the actual server. I have Google Analytics enabled with the correct URL (www.mysite.com). Is there a way to enter some kind of metadata that forces the robots to see the address as www.mysite.com? Thanks"} {"_id": "37572", "title": "301 redirect to different directory on Yahoo Small Business Hosting without .htaccess", "text": "I have a website hosted with Yahoo Small Business Hosting, and I don't have access to use a _.htaccess_ file. I have around 220 pages in a folder `mysubfolder` (`http://example.com/myfolder/mysubfolder`) and the age of the website is around 3 years. I am planning to move all 220 pages in `mysubfolder` to `myfolder` (one level up). All the pages in `mysubfolder` are indexed. What is the best way to do this, so that it won't affect my SEO?"} {"_id": "22034", "title": "MODx CMS and SEO", "text": "Does anyone have any experience with MODx CMS? How good or bad is it for a MODx-based website in regard to SEO?"} {"_id": "22035", "title": "Firebug Disabled on Some Sites?", "text": "I have noticed on some websites that Firebug is off while on other sites it is on. 
I have not made any changes to these settings, so how does a site tell Firebug to turn off, or how does Firebug know not to turn on for a given site?"} {"_id": "22039", "title": "Exit link tracking with timestamped logs on 3rd party content", "text": "I want to track clicks on exit links that are placed in 3rd party content, for example on Twitter. I also need the timestamps of the clicks. Google Analytics can't be embedded in 3rd party content. Another solution is to use a URL shortener like bit.ly. However, bit.ly and goo.gl don't log the time of the click with any better granularity than a full day. su.pr shows the time for the past day in its analytics graph. The analytics download only includes the day, not the time. cli.gs was touted as having the most detailed analytics, yet it doesn't show the time either, and forces the user through a preview page. Hootsuite/ow.ly doesn't let you drill into analytics intraday."} {"_id": "61253", "title": "What should my robots.txt look like for a single page app", "text": "I can understand how to disallow bots from crawling some pages/folders in a normal application. For example, for Googlebot it is nicely described here. But what should I do if I have a single page application (one that uses only AJAX to load new content and has routing and page generation on the client)? How to make it crawlable is described here and here, but what if I do not want a bot to follow some links (that are on my starting page)? By this I mean the following: When the SPA is loaded for the first time it loads some basic HTML. This HTML can have specific links like: * home (#!home/) * about (#!about/) * news (#!news/) but I do not want a bot to crawl the #!about link."} {"_id": "55682", "title": "Website won't work without www on mobile phone", "text": "I have updated the _.htaccess_ file so that when someone types in `example.com`, they are redirected to `www.example.com`. This works great on my computer, but when I try it on my cell phone it does not redirect. 
Instead I get the \"Webpage not available\" message. Does anyone know why? I've designed many websites and have never run into this problem."} {"_id": "61254", "title": "Google webmaster tools: parameters that only apply on one page", "text": "I'm trying to get my e-commerce website on google and still figuring out how it all works. Now, I have seen this feature named URL-parameters, allowing me to set different parameters that affect page content to be indexed (one can also set parameters that do not affect the page, but for me that does not apply..). The question I have about this is whether and how I should add parameters that I only have on some pages of my site. example: The homepage of my site is `www.mysite.nl`. no parameters at all. But when a user clicks the navigation bar, it links to `www.mysite.nl/itemList.php?category=&....subCategory=....` The parameters `category` and `subCategory` define whether there is content on my `itemList` page and what content that is. It gets matching products out of my database based on those 2 variables. The question: How do I make sure that I apply the google URL Parameters function decently for my website?"} {"_id": "16928", "title": "Google Ranking for long tail search phrases", "text": "When using long tail search phrases for products (items to purchase), google generally returns multiple URLS for various ecommerce sites, but it hides additional urls for each site behind ![show more results](http://i.stack.imgur.com/NuuYC.jpg). Each site tends to get one URL in the SERP and the rest are hidden (this isn't always the case as sometimes it shows 4 or 5 pages for a given domain but they are almost always grouped together). 
Since the related pages for each domain show up grouped together in the original SERP, is Google using domain authority to determine ranking for the collection of pages, or is it ranking the entire set of pages for the domain based on the one page it chose to display in the SERP?"} {"_id": "35660", "title": "Will Google harm my ad's rating if my display URL in adwords does not exist?", "text": "I have a service in a very big city, and to make my ads stand out, I display the name of the neighborhood in the display URL, although the actual URL is always the same page with my services for any neighborhood. Will Google decrease my rating for this? On a different note, you can also express your ideas about whether it's sensible to do this or not, i.e., the client might feel I'm deceiving them. Although this will be subjective, unless there is some scientific study. Thank you!"} {"_id": "35662", "title": "Where do I set my SPF record for domain managed by Yahoo?", "text": "Many years ago we purchased a domain from Yahoo. Now our website is hosted on Amazon EC2. The output of an SPF checking tool (http://www.kitterman.com/getspf2.py) says: > SPF records are primarily published in DNS as TXT records. > > The TXT records found for your domain are: i=182&m=bizmail-mx2-p9 > > SPF records should also be published in DNS as type SPF records. No type SPF > records found. > > Checking to see if there is a valid SPF record. > > No valid SPF record found of either type TXT or type SPF. Where do I get access to these values? Can somebody speculate where I can find an interface, or a configuration file, to fill in the missing fields? **Edit** To clarify, the domain name is managed by Yahoo and the actual server is an EC2 instance being routed via an elastic IP. The best I could find on the Yahoo control panel only mentions CNAME, A and MX. Where is SPF? 
Here's a Yahoo help article covering this."} {"_id": "48127", "title": "Caches and Website Screen Shots", "text": "If I am creating a website to filter retail websites, by things like shipping or eco-friendliness, etc., could I use the screenshot of the homepage as the icon linking to the site itself? I.e., if my filters result in The Gap, www.gap.com, and there is a snapshot, or cache, of the homepage, am I infringing on their IP?"} {"_id": "44640", "title": "Online Bookmarks", "text": "I want to save my bookmarks online so that I can access them from anywhere, from any computer. So I want to know a site name, or if there is any tool for it, as I can't remember lots of sites' names."} {"_id": "54532", "title": "Google Webmaster Tools incorrect rel-alternate-hreflang implementation warning message", "text": "I'm getting this warning message in Google Webmaster Tools: > Incorrect rel-alternate-hreflang implementation In particular, there seems > to be a problem with missing or incorrect bi-directional linking (when page > A links with hreflang to page B, there must be a link back from B to A as > well). This message seems pretty straightforward, but when checking their example pages, I'm not finding anything wrong. I'm using alternate for translation of the main site menu, titles, etc. In each page I have this: <link rel=\"alternate\" hreflang=\"en\" href=\"http://mydomain.com/page\" /> <link rel=\"alternate\" hreflang=\"jp\" href=\"http://ja.mydomain.com/page\" /> <link rel=\"alternate\" hreflang=\"ko\" href=\"http://ko.mydomain.com/page\" /> <link rel=\"alternate\" hreflang=\"th\" href=\"http://th.mydomain.com/page\" /> <link rel=\"alternate\" hreflang=\"es\" href=\"http://es.mydomain.com/page\" /> <link rel=\"alternate\" hreflang=\"pt\" href=\"http://pt.mydomain.com/page\" /> I've double-checked that this exists in all 6 pages. 
This is the first time I've seen this message, although I implemented this at least six months ago and the implementation hasn't changed. Is there any way to check a specific set of pages for these things? Am I missing something in my implementation? We auto-redirect people from a location to their specific language, and give them an option to change this manually. I've also just found out about the suggestion for a `Vary HTTP header` - is that relevant and important here?"} {"_id": "31968", "title": "Using RegExes in Multi-Channel Funnels in Google Analytics", "text": "For some reason, I can't get my multi-channel funnel which utilizes RegExes in the path steps to function -- it keeps coming back with no data. There are a few variables which may be holding things up, but I can't figure out the origin of the problem, nor a solution. Here's the situation: * The funnel is tracking conversions, defined as when a user completes 4 steps to sign up * Steps are not \"required\" * Default URL is set to `https://example.com` * There is a 302 redirect set up on our site that leads from `http://example.com` to `https://example.com` * Within the funnel, steps switch from non-secure pages (unless the browser is set to secure browsing) to secure pages once the user moves from the landing page to the second page of the sign-up process (account placeholder has been created) * The URL at that point contains the publisher number variable within (but not at the end of) the URL * My RegExes are all properly written, as tested on rubular.com"} {"_id": "4748", "title": "How do I hide my DNS servers from whois when using Domains By Proxy?", "text": "I registered my domain with GoDaddy and use the Domains By Proxy privacy add-on, but the DNS servers still show up when someone runs a whois lookup. 
How do I make it private?"} {"_id": "43338", "title": "Drupal 7 shared cloud hosting problem", "text": "I'm hosting a Drupal 7 website on site5.com cloud shared hosting, and Drupal's cache queries seem to run very slowly on almost every table whose name starts with cache_... The site uses 1-2 custom modules, the i18n module only for front page text, about 13 custom content type fields, and has about 650 nodes. So far the site has been used only by me and my client. I contacted the host and they say the problem isn't on their end, but I also run another Drupal 7 site, a webshop in Croatia with 7000 nodes and 30 active users per second on a poorer host, and everything there is going well. My questions are: could this be something to do with the new Drupal 7 update (7.19), is my website really that slow (which I highly doubt), or is cloud shared hosting just a cheap trick from site5.com?"} {"_id": "57795", "title": "How to build backlinks for a Malayalam site, in either English or Malayalam", "text": "I am new to SEO and need some help. The website's content is entirely in Malayalam, but the keywords are in English. How do I build backlinks for it?"} {"_id": "57788", "title": "Google News not showing images from my site", "text": "I have a question for you. One of the sites I'm working on has been accepted into Google News. My issue is that most of the time no image is fetched, while at other times a related image from another website is shown instead. I have read somewhere that the image needs to be near, or referenced in, the title tag - is that so? Any idea what I am doing wrong?"} {"_id": "13670", "title": "Help me become the next Zuckerberg", "text": "I really hope someone can help me with my problem as stated below. I started learning website making from the tutorials on w3schools.com, but after taking the HTML tutorials I don't think I can really do anything with it. Maybe I don't know everything about HTML yet. 
I know the online tutorials are not enough, but what I need you guys to tell me is how I can learn to make some nice websites. I have a two-month vacation now, and I think it would be the best time for me to learn making websites, as I really want to be a website developer. I would like to start from scratch. I know some C language _(though I know it will not help me)_. So tell me where I should learn HTML, what things I should learn after that, and in what order. It would also help if you could point me to material _(be it books or online tutorials)_. I would also like to know how I can gain confidence that I can make some websites, as just following online tutorials doesn't help me much. I tried some video tutorials too, but they also only cover some of the functions, not all of them. So what am I supposed to do - know only the limited set of functions the video tutorials want me to know, or something else?"} {"_id": "13671", "title": "Track Promotional Code Sales", "text": "Is there a way I can track actual sales on purchases utilizing promo or discount codes obtained through my site? My site will link to e-commerce sites where users can use those promo codes on their purchases to save money. My site will not actually be selling any items; it is all referrals to other sites. I want this to be done outside of any 3rd-party commission platform such as Commission Junction or LinkShare. Thanks!"} {"_id": "31967", "title": "Is there a way to do a site-wide 301 redirect or rel=\"canonical\"", "text": "I've got a blog running on a subdomain, mirrored using this method onto a subdirectory, so that the links point to my main domain, not the subdomain. Obviously this creates duplicate content. What I was planning on doing was to block Googlebot from seeing the subdomain using robots.txt and then get it to index the subdirectory. What I would also like to do is insert 301 redirects from each of the pages as well as rel=\"canonical\". 
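For the 301 half, a single pattern rule can cover every post instead of per-page redirects. A hedged sketch, assuming Apache mod_rewrite and the example hostnames used in this question:

```
# In the subdomain's vhost or .htaccess: send blog.mysite.com/anything
# to mysite.com/blog/anything with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.mysite\.com$ [NC]
RewriteRule ^/?(.*)$ http://mysite.com/blog/$1 [L,R=301]
```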
But because this is a blog there is always new content being added, so I would have to add new pages to the 301 each day and add new rel=\"canonical\" links into each page. Is there a way to write the 301 redirect & rel=\"canonical\" so that anything after the slash stays the same but the domain name itself changes? I.e. blog.mysite.com/post1 would automatically be redirected to mysite.com/blog/post1"} {"_id": "4745", "title": "Does page impression get counted for unique visits?", "text": "In Google AdSense, if a user from any territory revisits, the page impression count gets updated but the eCPM does not - it only shows zero. Does it work only for unique visitors? Thanks, Sneha"} {"_id": "13675", "title": "What should I do if I have 2 UI tabs with content that allows the user to access the same page?", "text": "I created an employee review application that has top-level stateful tabs just like the ones on this site. There is a tab called \"My Employees\" and a tab called \"Employee Management\" (used by HR only). The issue I have is that the information displayed on both the \"My Employees\" tab and the \"Employee Management\" tab gives the user access to the \"View Employee\" page. So right now, since you can access the \"View Employee\" page from both of these tabs, I am deactivating all tabs when the user is viewing that page. I'm doing this because I don't know which tab to have active. Does anyone have any suggestions? Is there a better UI navigation technique that I should be using for these types of scenarios, or does this indicate a flaw in the UI design in general? Is it OK for \"View Employee\" to be accessible from 2 tabs? Should I just use stateless tabs (buttons), or is that poor design as well? 
Thanks, and I appreciate any help you can provide."} {"_id": "16002", "title": "Using Google Webmaster & Analytics, what data to look at to improve website performance?", "text": "Using data from Google Analytics and Webmaster Tools, what data should I be looking at to improve my website's performance? I want to improve the SEO, usability and just general performance of my website. EDIT: It's a portfolio website that we've done the initial SEO for; we have also optimised all images etc. and made the site as fast as possible. What kind of things should I be looking out for in the Analytics and Webmaster data to improve performance for both the SEO and each individual page? EDIT: Is there some kind of weekly/monthly checklist of things to check to improve SEO and performance?"} {"_id": "32243", "title": "Using nofollow when crosslinking my own sites", "text": "I have 20 sites on the same domain, in different niches, and I want to interlink them for USER EXPERIENCE. I want to know from someone who has tested this: will using a nofollow attribute for the links keep the sites safe from Google link scheme penalties?"} {"_id": "55643", "title": "Will search engines penalize spam links in Facebook comments for websites?", "text": "I have many static HTML web pages in which I had used Facebook comments. A user had posted a spam link in one of the page's comments. I logged in as a moderator and deleted it, banned the user, and marked it as spam. Yet Facebook displayed the message: `Comments from USERNAME will only be visible to his friends. Undo`. As the message says, it provided an undo link. The problem is that in the future, when search engine bots crawl the Facebook comments and follow the links posted in them that lead to spam websites, I may get a penalty for linking to spam sites. So how do I remove the comments entirely from Facebook, so that not even the user's friends can see them? Maybe if only their friends can see it, then search engines cannot crawl it. 
But if his friends use those links then that might add a link to the target site, which leads to my site being the referrer to the spam site. So what should be done?"} {"_id": "19761", "title": "https & ajax crawling", "text": "We transitioned our web page https://www.1point618.com to SSL and now load the content almost entirely via Ajax. Therefore all URLs of existing pages have changed. We used 301 redirects as recommended, and we have also implemented Google's specification so that the web page is still crawlable. We thought it might take a month to regain the same ranking in Google's search results, but the results are still much worse than before these changes. Most of the content (artist profiles) isn't indexed anymore. For example, of the submitted sitemap only 3 of around 450 URLs are indexed; before, almost all URLs were indexed. My question is now: does Google's Ajax crawling work together with SSL? (Judging by the access log file, it looks like it should.)"} {"_id": "5869", "title": "List of all URI elements for a website", "text": "Does anyone know a good way of getting a list of all the URI elements for a website? I plan on moving a website to a new CMS and would like to set up some 301 redirects for articles, images, CSS, and JS files that will be moved to a new location. I'm looking for something like a \"site sucker\" application that creates a list of URI elements for a website, instead of downloading them."} {"_id": "32247", "title": "Poor backlink profile - search rankings not updated for 2+ months", "text": "I am carrying out some work on a website that is a PR2 with a few good-quality, relevant backlinks (PR4-6). It has a presence on Twitter that is updated regularly, a Google Places listing, and listings on some decent directories (Qype etc). 
The site was rebuilt in Drupal 7 two months ago, with all the basics done - URL rewriting, an XML sitemap submitted to Google, and most importantly, good-quality, structured content. I've noticed that Google is still showing \"old\" URLs from the previous version of the site that was ditched 8 weeks ago. I think the site may be penalised under the Penguin update, as a previous SEO company created many low-quality links from link farms/directories. My question is what the correct way to deal with this is. Bing Webmaster Tools can \"disavow\" links, and I guess I can attempt to contact the link farms to have them removed. I've already submitted a request to Google asking to have the penalty removed, as we're trying to tidy up a bad history. We submit updated sitemaps to Google and Bing daily, and have built some further decent-quality, relevant links. Is there anything further I can do?"} {"_id": "16004", "title": "Which status to use for a temporarily inactive page", "text": "I was wondering if someone could help me with managing temporarily inactive pages with regard to SEO and search engines. The case: I manage a big ecommerce site, and sometimes I need to take down page(s) - it could be for days, weeks, or months, depending on our vendor. If my visitors land on a page that is temporarily inactive, I can give them a message that the vendor they are looking for is not available at this time and that they can check back later OR check another vendor with similar products. But how do I send that message to search engine robots? If I use a 301 status and forward the URL to another page with similar products, the chance of the current URL being deindexed is huge, while I still want to use that URL in the future if my vendor wants to re-join. Any advice will be highly appreciated"} {"_id": "49168", "title": "Multiple top level domains - should I redirect/rewrite or leave them?", "text": "I have a website which has many top level domains e.g. 
* mywebsite.com * mywebsite.org * mywebsite.co.uk * mywebsite.org.uk The website is advertised as `mywebsite.org.uk`. But I expect lots of people will type `mywebsite.com` or `mywebsite.org` because it's more familiar/easier. I could just leave the URL alone or I could redirect visitors to the \"correct\" `mywebsite.org.uk` domain. Is there any guidance on what I should do?"} {"_id": "55648", "title": "SEO to avoid possible duplicate content", "text": "My website is about bands and musicians, but I have two problems with it: 1.) A musician can create a profile and add his bio. Bands can also create a profile and add band members and their bios. Many musicians are members of several bands, so I will have their bios repeated on several pages of my website. Any idea how to control this, since the site will be populated and controlled more or less by its users? Here's an example: A band has 10 members and will create a \"band\" profile. Let's say half of the band members will create their own profiles, while half don't care and will never do this, or they have it on some other website and won't bother doing it again on my website. 2.) If musicians have bios on my website, there is also a (huge) possibility that they have their own website with the same bio just copy-pasted, or have the same bio on MySpace and similar websites, which will again result in duplicate content. If I disable the option for users to add their bios, then the whole concept of the website doesn't make sense."} {"_id": "5867", "title": "Convert PDF to JPEG on Linux?", "text": "Sometimes clients email me a poster, which they received from a graphic designer in PDF format, that they want added to the site. I have generally told them to ask the designer to export it in JPEG format. This has been working out OK, but is there any easier option? 
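On the command line there are quicker routes than screenshots. A sketch assuming Poppler's pdftoppm or ImageMagick is installed (the filenames are placeholders):

```
# Poppler: render the first page of the PDF as a JPEG at 150 DPI
pdftoppm -jpeg -r 150 -singlefile poster.pdf poster   # writes poster.jpg

# ImageMagick alternative; [0] selects the first page (zero-based)
convert -density 150 poster.pdf[0] -quality 90 poster.jpg
```

Either tool avoids the screenshot-and-crop round trip and keeps the output at a chosen resolution.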
The only other way I have figured out is to take a screenshot of the PDF and then crop it using GIMP, but that's kind of time consuming."} {"_id": "19769", "title": "chmod 700 and htaccess deny from all enough?", "text": "I would like to protect a public directory from public view. None of the files will ever be viewed online. I chmoded the directory to 700 and created an htaccess file that has \"deny from all\" inside it. Is this enough security, or can a hacker still gain access to the files? I know some people will say that hackers can get into anything, but I just want to make sure that there isn't anything else I can do to make it harder to hack. Reply: I am asking if chmod 700 and deny from all is enough security alone to prevent hackers from getting my files. Thanks."} {"_id": "12230", "title": "Image Composition Services for Ecommerce", "text": "I'm researching the market for image composition services for ecommerce software. Systems that will manage high-resolution images for product variations (multiple colors, different text/logo on the same image of a lady, etc.) Specific questions: * Scene7 seems to be the leader in this space. Are there lower-cost alternatives to Scene7? * What sort of costs **can** I expect for this kind of hosted service? Scene7 doesn't list prices, which leads me to believe they have metrics to size up each client and maximize profits. * Outside of the software-as-a-service market, are there open-source and/or commercial self-hosted solutions? * Not necessary, but any services with out-of-the-box Magento integration jump to the front of the line."} {"_id": "56667", "title": "Visitors with at least 1 event click? 
(Visits with Events)", "text": "I might be getting a bit too lost in the metrics to realize that I already have the numbers I'm looking for, but I'm trying to figure out how to report the number of unique visitors who interacted with a specific element on a page and caused an 'event click' (so the # of visitors who caused an event). I currently have a dashboard set up with a widget that reports, in a table, the number of Unique Visitors by Event Category. However, if there are multiple labels ('Next', 'Back', 'More Info'), are the multiple labels messing with the visitor number that Analytics is reporting? So say we have a slideshow on our page (along with some other content); I just want to see the number of 'Unique Visitors' that clicked ANYTHING on that slideshow. So Person A clicks 'Next' - that registers as 1 unique visit. Person B clicks 'Next', 'Back', and 'More' - so now we're up to two unique visits. Person C visits the page but doesn't click anything, so we're still at 2 unique visits. I...might already have this down with my widget...but the metrics are seriously scrambling my brains. Any advice?"} {"_id": "35643", "title": "Is trailing slash automagically added on click of home page URL in browser?", "text": "I am asking this because whenever I mouse over a link to a home page (e.g. http://www.example.com), I notice that a trailing slash is always added (as observed in the status bar of the browser), whether or not the home page link contains a href attribute that ends with a slash. But whenever I am on the home page, the URL on display will not have a trailing slash. I tried adding a slash to the URL in the URL bar, and with Firebug enabled, I notice that the site always returns a 200 OK status. An article here discussing this states that having a slash at the end will avoid a 301 redirection. But I am not seeing any redirection, even on this page. 
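For what it's worth, the slash on a root URL comes from HTTP itself rather than from any redirect: a request line cannot have an empty path, so the client normalizes http://www.example.com to http://www.example.com/ before sending it. This is visible with curl (assuming it is installed):

```
# curl -v prints the request line it actually sends; note the "/" path
# even though the URL below has no trailing slash
curl -v http://www.example.com 2>&1 | grep "> GET"
```

The request line shows as `GET / HTTP/1.1`, so no 301 is needed to add the slash at the root.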
Could this be a browser feature that is appending the slash?"} {"_id": "52349", "title": "What is the screen size used by Google's PageSpeed Insights for mobile previews?", "text": "Google's PageSpeed Insights allows one to check a website's speed issues for both desktop and smartphone devices. I have some media query CSS for smartphones on my website, but I am unsure about PageSpeed's smartphone screen size. Does anyone know the screen width?"} {"_id": "17358", "title": "Which is better, jPlayer or JW Player?", "text": "I need to make a video slideshow (the slides will contain embedded video players and must play MP4 files), and I am kind of confused about what would be the best free solution to use. jPlayer and JW Player both seem good, but I can't find any comparisons between them, so I want some expert opinions and user experiences on these two. Also, if there are any other free and open-source solutions you can recommend, I am all ears ;) My requirements are: The player must be HTML5 with fallback to Flash. It should support MP4 files as well as FLV. It should also have the ability to double as an audio player (MP3) along with its video capabilities. It should be lightweight and easily customizable in terms of looks."} {"_id": "17609", "title": "Publishing a web application with Linode", "text": "I just purchased and set up a Linode VPS. Normally I purchase a new domain name and set it up with my hosting, but Linode didn't ask for that. What do I have to do to publish my web application on Linode?"} {"_id": "56845", "title": "Why does Internet Explorer want to preload these 2 linked HTML pages?", "text": "After running a WebPagetest.org test for one of the pages of our website, I noticed - because of the red-marked HTTP file-not-found errors - that Internet Explorer (IE) wants to preload two HTML pages after the last JavaScript has been loaded. Not all linked HTML pages are preloaded by IE. The undesired page preloading happens with IE7, IE8, IE9 and IE10. 
The links are encapsulated within a `li` tag; to be precise, the path is `html > body > ul > li > a`. I can't prevent this unwanted IE page preloading by adding a `rel=\"nofollow\"` to the link: http://crashplan.probackup.nl/bugs/ie-link-will-preload-page/a-rel-nofollow2.en.html and an IE8 test result. Why does Internet Explorer want to preload these specific linked HTML pages in the first place, and how can I prevent this undesired preload?"} {"_id": "17607", "title": "Remote desktop to Ubuntu", "text": "I have a VPS running Ubuntu and I have shell access to the server. I wonder, is there any way to connect to the Ubuntu desktop and see its UI from Windows, using remote desktop tools such as TeamViewer or whatever other apps? Is that possible, or do I have to manage my server using just the command line?"} {"_id": "17605", "title": "What is wrong with this Apache httpd.conf file for vhosts?", "text": "**What is wrong here?** NameVirtualHost *:80 <VirtualHost *:80> ServerName www.apples.co.uk DocumentRoot /var/www/apples.co.uk RewriteEngine On RewriteCond %{HTTP_HOST} !^www\\.apples\\.co\\.uk$ [NC] RewriteRule ^(.*)$ http://www.apples.co.uk/$1 [L,R=301] </VirtualHost> <VirtualHost *:80> ServerName www.bananas.co.uk DocumentRoot /var/www/bananas.co.uk RewriteEngine On RewriteCond %{HTTP_HOST} !^www\\.bananas\\.co\\.uk$ [NC] RewriteRule ^(.*)$ http://www.bananas.co.uk/$1 [L,R=301] </VirtualHost> **Problems:** * `apples.co.uk` does correctly redirect to `www.apples.co.uk`, but `bananas.co.uk` redirects to `www.apples.co.uk`. * When either `apples.co.uk` or `bananas.co.uk` redirects, the address bar ends up with `http://www.apples.co.uk//`. What is causing that extra slash at the end? 
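Regarding the doubled slash: in per-server (VirtualHost) context, the path that RewriteRule matches keeps its leading slash, so for a request to the root `^(.*)$` captures \"/\" and the substitution appends it after the hard-coded slash. The usual fix is to make the leading slash optional in the pattern - a sketch for the first vhost:

```
RewriteCond %{HTTP_HOST} !^www\.apples\.co\.uk$ [NC]
RewriteRule ^/?(.*)$ http://www.apples.co.uk/$1 [L,R=301]
```

The same `^/?` tweak would apply to the bananas vhost.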
Here is the output of `apachectl -S`: [Sun Jul 31 08:41:52 2011] [warn] NameVirtualHost *:80 has no VirtualHosts VirtualHost configuration: wildcard NameVirtualHosts and _default_ servers: *:80 is a NameVirtualHost default server www.apples.co.uk (/etc/apache2/httpd.conf:3) port 80 namevhost www.apples.co.uk (/etc/apache2/httpd.conf:3) port 80 namevhost www.bananas.co.uk (/etc/apache2/httpd.conf:11) port 80 namevhost servername.apples.co.uk (/etc/apache2/sites-enabled/000-default:1) Syntax OK I guess the first problem is caused by the `default server` setting. How do I remove that? Does there need to be a default? I know I also need to add to `sites-enabled` to get rid of the warning, but I'll leave that for another question."} {"_id": "17604", "title": "JavaScript control of <audio> not working in Firefox, works in Chrome", "text": "<audio id=\"audioMusic\" loop=\"loop\"> <source src=\"/Music/behind_you.ogg\" type=\"audio/ogg\" /> <source src=\"/Music/behind_you.mp3\" type=\"audio/mp3\" /> </audio> <script type=\"text/javascript\"> var a = document.getElementById(\"audioMusic\"); a.play(); </script> **Problem / Question:** This, along with countless other ways of controlling audio via JavaScript, works fine in Chrome but does not work in Firefox or IE. Why? How can it be changed? By the way: yes, I know I can use the `autoplay` attribute; that's not what I am looking for, because I need to be able to turn the audio on and off and adjust its volume via JavaScript. And yes, I know that background music is annoying."} {"_id": "17350", "title": "Web site analysis", "text": "I am looking for an open-source API for my existing website. The functionality that I am looking for is: 1. How many users visited the site in a particular interval of time? 2. What is the IP address of the machine the request came from? 3. Which region/country/city does the user belong to? 4. Which operating system are they using? 5. Which browser did they use to send the request? 
What are my options?"} {"_id": "17353", "title": "What would happen to my domains if my domain provider went bankrupt", "text": "I am thinking of transferring all my domains to one registrar so that managing them will be easier. But I am a little bit worried about the following question: what would happen to my domains if my domain provider company went bankrupt?"} {"_id": "17600", "title": "Why can't Internet Explorer render this page correctly?", "text": "I've been pulling my hair out but can't figure out why my page looks like a huge mess in IE9, while Firefox, Chrome, Opera and Safari work great. My website is here: http://173.244.195.179/test-o.html Could anyone tell me what I'm doing that isn't compatible with IE9?"} {"_id": "25387", "title": "Can a registrar ransom a domain name which has become popular?", "text": "I am slightly paranoid about the domain name I have. I think the site I am building could have potential and take off in the future. I am worried, though, that the people I registered the domain name with might try to extort more money from me or take the domain name for themselves. I am guessing that when Google first started out they must have bought their domain name through a registrar, so how did they keep hold of it, make sure no one stole it from them, and prevent the company they registered it with from extorting more money from them? How much do you think Google pays to keep its domain name? Do you think it's still the original price they paid back all those years ago? Do registrars do this? If so, what is the likelihood it could happen with my domain name? How can I protect my domain name? Who actually owns a domain name? 
Maybe I am just being paranoid, but I would hate to see the cost jump from $10 to something like $1 million when my domain needs renewal."} {"_id": "47712", "title": "How to manage and organize 'requests' for website updates from a variety of departments: To Do List", "text": "I am taking on the responsibility of managing two websites (WordPress) for nonprofit societies. I receive multiple requests for mini content changes from a variety of committees and departments. As a volunteer, I'm hoping to park these requests in a chart or to-do list that I can then use to work through a slew of easy, one-minute content changes all in one go. I considered creating an Excel sheet for this, but then thought that the professionals on PW.SE might be able to point to a template or to-do-tracking system that is already done up and better thought out than what I could cobble together. When it's time to work, I need to direct myself to the page that needs updating, input/upload the desired changes, and then check it off the list."} {"_id": "4607", "title": "Is there an easy way to see the amount of gzip compression in Chrome", "text": "I am busy checking how my web server is doing gzip. I'm confident now that gzip is on, as Chrome shows the content-encoding: gzip header. Is there an easy way to see how much a file was compressed in the Chrome developer tools?"} {"_id": "4604", "title": "Google spam report, have they ever punished a bad site you reported?", "text": "You have reported a bad site via the Google Webmaster Tools spam report. Assuming you are sure it was a bad site, **did you see Google punishing it or doing something after your report?** Thanks"} {"_id": "2188", "title": "Google Webmasters Verification Method", "text": "The Googlebot seems to be hitting my site with the incorrect verification method. I have Google Webmasters set up to use a meta-tag for verification, but I am still receiving requests for the HTML file method (which result in 404s). 
Google Webmasters still shows my site as verified. Should I be worried about this? Could this be coming from another Google service? Should I switch to the HTML file method? Should I just use both? Here are the requests from the log, just in case that's helpful. 66.249.85.2 - - [12/Aug/2010:08:56:04 -0700] \"GET /googlea6bf195e901587d1.html HTTP/1.1\" 404 124 - \"Google-Site-Verification/1.0,gzip(gfe),gzip(gfe)\" 66.249.71.118 - - [11/Aug/2010:05:40:57 -0700] \"GET /googlea6bf195e901587d1.html HTTP/1.1\" 404 124 - \"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html),gzip(gfe),gzip(gfe)\""} {"_id": "4602", "title": "What do you look at when viewing competitors' websites?", "text": "BradB's post has me thinking about competitors' websites: what do you look at, and how long do you spend doing it? I'm thinking along the lines of content (how much, how well written), site structure, and SEO techniques."} {"_id": "60253", "title": "Why is App Pool Identity not created until first request is received?", "text": "I work for a software company and regularly deploy our web applications on our customers' IIS boxes via an install wizard which was developed in-house. By default, the install process creates a new App Pool for the application to run under, which itself runs under a default IIS App Pool Identity. The final step of the installer is to grant permissions on certain folders to the IIS App Pool Identity. Recently I've been working in a new customer's environment, and every time I do a new install this final step fails, as the App Pool Identity doesn't exist. I've found that browsing to the web app and rerunning the installer works, as the Identity is created when the first web request is received. Why is this? This is unusual behaviour in my experience. I'm guessing this is something related to domain policy, but I haven't been able to find anything on the subject. 
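For context, \"IIS AppPool\\<name>\" identities are virtual accounts whose SIDs are derived from the pool name, and in some environments the name only becomes resolvable once the pool exists (or has been started). A hedged workaround sketch for the installer's ACL step (the pool name and path here are hypothetical):

```
REM Grant modify rights on the app folder to the pool's virtual account
icacls "C:\inetpub\myapp" /grant "IIS AppPool\MyAppPool":(OI)(CI)M
```

If the name still fails to resolve before first start, issuing a warm-up request (or starting the pool) before the ACL step, as described above, remains the pragmatic fix.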
As mentioned, I can work around it, but it would save time if I could get this changed and, more importantly, I assume this is deliberate and want to know why! This is Windows 2008 R2 with IIS 7.5. (I hope I've got my terminology right)."} {"_id": "4600", "title": "Creating a Facebook page with your domain name", "text": "I want to create a Facebook page that has my domain name, like this: http://www.facebook.com/TrimHold But I can't find a way to do it. Any suggestions would be highly appreciated. Thanks."} {"_id": "410", "title": "Is it better to put hyphens in a domain name?", "text": "Both in terms of `SEO` and user friendliness, is it better to put hyphens in a multi-word domain name or not? For example, is `www.stackoverflow.com` better than `www.stack-overflow.com`?"} {"_id": "57654", "title": "Should I include a dash in my domain name?", "text": "Wanting to buy a new domain name, I wonder whether, in SEO terms, it is worth including a keyword with a dash, or whether it would be better to avoid dashes. Without a dash, it can be unreadable. See below: www.psychologist-smith.com or www.psychologistsmith.com or www.smith.com"} {"_id": "25570", "title": "Best SEO URL name for two word searches", "text": "> **Possible Duplicate:** > Is it better to put hyphens in a domain name? QUESTION: What would work better for a two-word search string such as \"green shoes\"? Acquiring the domain greenshoes.com OR the domain green-shoes.com? Or another? Any other tips on how to score high with two-word searches? Help appreciated!"} {"_id": "55916", "title": "Which domain is better for SEO purposes?", "text": "I'm building a website to convert PSD files to PNG and I have two options to choose a domain: 1. convert-psdtopng.com 2. convertpsdpng.com I need the word convert to be there too. Which one do you think is better to rank for the \"convert psd to png\" keyword? 
Thanks!"} {"_id": "62904", "title": "Do you think that adding a dash (-) in domain names would impact SEO?", "text": "For example, do you think that Google and other search engines treat it differently when one domain name is like www.siteaboutsoccer.com and another is named www.site-about-soccer.com? Is it better to separate the keywords?"} {"_id": "15019", "title": "Using a hyphen in the domain name", "text": "> **Possible Duplicate:** > Is it better to put hyphens in a domain name? I am looking to buy a domain where the obvious choices have already been purchased. Example: `domaintest.com` or `domaintest.co.uk` So, I am looking to mix things up a little with a hyphen. Example: `domain-test.com` or `domain-test.co.uk` I have been looking around the web & Stack to get a solid picture of why or why not to use hyphens in the domain, and am a little confused. The key factors for me are usability (can users remember the domain, can they type it easily on a phone etc.) and SEO. Ian"} {"_id": "14475", "title": "SEO domain keyword advantage - alltogether.com or all-different-words.com?", "text": "> **Possible Duplicate:** > Is it better to put hyphens in a domain name? I have heard that your domain ranks better for a term if the domain name matches that term (e.g. stackexchange.com ranks well for \"stack exchange\"). If there are 2 (or more) words that you are targeting (e.g. \"stack exchange\"), do you get a better Google keyword bonus if you make it alloneword or if you use hyphens? E.g. if you want to target the search term \"stack exchange\", should you register stackexchange.com or stack-exchange.com?"} {"_id": "68163", "title": "Domain name has hyphens and exact match, should I change it?", "text": "The blog is currently a Blogspot blog. I used part of the name of the real website that gave me the idea to start the blog. It's time to get a registered domain for it, instead of the name.blogspot.com it is now, and I got itchy and bought a domain name that has 4 words and hyphens. 
Also, I believe it is an exact match domain. I see now that hyphens are possibly bad, and so maybe are exact match domains. I am not sure if it is wise to use this new domain name or get a different one. Or, if I use this domain name, should I just get it with no hyphens, just a run-on of 4 words? A different alternative is to use the name of the site I originally started the blog for and add a couple of words, which I think would make an exact match domain. If I do that, should I use hyphens or no hyphens? In summary: do I use hyphens or not in a domain with 4 words where the 3rd word is actually the letter n, such as cookies-n-creme for example? Or do I use the real website's name plus the two-word exact match, meaning NAMEwordword.com, and if I go this route do I use hyphens or not? I hope you can understand this question. It is hard for me to put it into typing. The reason it is an exact match name is that it is actually descriptive of the blog and I haven't been able to come up with anything descriptive that isn't."} {"_id": "48838", "title": "404 errors: How to deal with a large number of broken internal links for SEO", "text": "I have recently updated my website. In fact, I have completely rewritten it using WordPress. It was originally constructed using ASP. In the process, I have recycled a large amount of my original website. This resulted in a large number of broken links. Google Webmaster Tools reports more than 30000 broken links, and most of them are internal. These mainly result from reported links to URLs not used anymore, that is, links to scrapped, obsolete URLs from the previous version that do not exist any more. From an SEO perspective, I have read that if 404 errors are from internal links, it is best to delete the links. When I click on a broken link from the list displayed at Health -> Crawl Errors, Google shows me where this broken link appears in the 'linked from' tab. When I click on links from 'linked from', the purported broken link is not displayed on the 'linked from' page.
That is, users of my website have no way of trying to load this broken link. The broken link still shows up in the page source, i.e. when I view the page source from Chrome. The broken link is usually used in some JavaScript. To clarify, let www.myhomepage/broken_link be the broken link reported by Google Webmaster Tools. The Google 'linked from' tab shows that this URL was linked from www.myhomepage/some_other_page. When I view the page source, the broken link is usually in [script type=\"text/javascript\"]var _bpfbRootUrl=\"www.myhomepage/broken_link\";[/script] Again, I read that it is advisable to just delete internal broken links, but there are just so many of them and it would be very time consuming to manually delete each and every one of them. What would be the best way to deal with this situation? I would like to avoid using a custom 404 page. A 302 does not seem like an option either, and parsing URLs with regular expressions so as not to redirect valid pages seems very complicated too. Google Webmaster Tools reports that the number of indexed pages on my site has been sharply declining since it was rewritten with WordPress, and has dropped to 10% of the original. Thank you."} {"_id": "23927", "title": "What is the reason for this site's sudden drop in traffic?", "text": "My one site's traffic has dropped in a single day from around 4000 to 800 visitors. My other sites are getting their normal visitor counts. I'm not using any blackhat techniques, trading links, in a bad neighbourhood, or hosting any copied material. But yes, it's a PDF document download site with a bounce rate of 32% and 3.5 pageviews per person. No issues are reported in Google Webmaster Tools. I find that for the relevant keywords my site has been pushed far down in Google. What could be the reason? **EDIT** I've uploaded two updated reports (Webmaster top keywords and an Analytics 1-month graph).
![Google Analytics 1-month graph till yesterday \\(28th\\)](http://i.stack.imgur.com/fwmLw.jpg) * * * ![Today's updated Webmaster's top queries report from 22nd-26th \\(Monday\\)](http://i.stack.imgur.com/tg1Qs.jpg)"} {"_id": "69090", "title": "How do I \"bind\" social network accounts to my blog?", "text": "My girlfriend has had a WordPress blog for a while now, but when we check tools like websiteworth or woorank, they tell us that she has no Twitter or G+ account, which is false. Is there a way to \"bind\" her social network accounts to her blog?"} {"_id": "23920", "title": "Migrating a national website to an international version", "text": "I don't know if this is the right place to ask this. I have a Persian web service (online diaries): 31shab.com and it's been operating successfully for a year, and we've decided to create an international version of this and rebrand it to 31nights (the meaning of 31shab in English). So I registered the domain 31nights.com and I want it to be multilingual. So I need to know what's the best way to do this migration successfully. * I want to move 31shab.com to 31nights.com/fa/ permanently (fa means Persian) in a way that doesn't ruin my SEO and Google ranking, etc. * What's the best way to structure languages? folders or subdomains or ... ? Thanks in advance and sorry for my English."} {"_id": "6450", "title": "How can I replicate this functionality?", "text": "http://www.astorandblack.com/ab/bespokevisualizer/ (click on \"continue as guest\") A client wants this \"bespoke visualizer\" on their site. Is there a third-party plugin I can buy for this? It can be in Flash and/or JavaScript, it doesn't matter. Also, if there's no third-party plugin that's similar to this web app, then how could I code this from scratch? Thanks, Steven"} {"_id": "68609", "title": "Does editing a WordPress theme from scratch affect SEO?", "text": "I'm a web designer and I'd like to make a custom WordPress theme for my upcoming blog.
The template is already made in Photoshop, but I need a good starting point. I don't have too much experience with WordPress, but I want something that is SEO-friendly which I can edit without risking getting poor search results. I know about Starkers and that there are other blank themes out there, but are they good for SEO?"} {"_id": "43555", "title": "Unique Visitor count increases when pages are excluded", "text": "For a date range greater than one day, the total number of Unique Visitors is greater when I exclude pages ending in \"foo\" than when I look at all pages. The difference is ~4%. Why would this be happening? For daily totals of Unique Visitors the difference disappears, and there are slightly fewer Unique Visitors when I exclude pages ending in \"foo\". Related to this, what is the difference between applying this through Advanced Segments vs. Filters? They both show subtly different answers. Cheers!"} {"_id": "60179", "title": "Can I view my PC localhost test site on my Android tablet without uploading it?", "text": "I am working on a web site which is running on my PC via XAMPP. I want to test the site on my Android tablet (Galaxy Tab) to make sure that the responsive design behaves the way it is designed to. I am wondering if there is a way to connect my tablet to my PC that will allow me to view my localhost site on my tablet. This would allow me to fix bugs without having to upload every change."} {"_id": "53627", "title": "Web assets and SEO - albums on WordPress", "text": "I'm told that it is considered good SEO practice to have multiple digital properties \"around\" a website, for example YouTube, Flickr, social media pages, etc. Fine. I have linked my website to my G+ page, for example. I'm about to upload an album to WordPress. The pictures exist in my G+ album. I was just going to download the pictures from G+ and then re-upload them to my WordPress Gallery.
Is there any SEO value in displaying the album directly from G+, rather than downloading and uploading to WordPress? This is a specific example, but what about other digital assets like YouTube videos? What's best practice here? Does this \"branch\" of SEO, if it exists, have a name?"} {"_id": "5531", "title": "Local SEO Strategies", "text": "What are your strategies for local SEO? For example, if you had a web design company in Lincoln, Nebraska, what would you do for your website from an SEO perspective?"} {"_id": "60986", "title": "How should I improve my position in map listings?", "text": "I don't understand how the map listings display their results in the Google SERP. Please see the screenshot below. I just want to improve my SEO for map listings, but I have no idea how to work on this. Does anyone have experience with this? Also, why is it not showing for all the keywords? I also noticed it displaying on some browsers and sometimes not on other browsers. Why is this happening? > ![enter image description here](http://i.stack.imgur.com/fJ39P.png)"} {"_id": "12700", "title": "Google SEO local ranking", "text": "> **Possible Duplicate:** > Local SEO Strategies With the new feature on Google search which uses local search (like the user's city), how do we make sure our website ranks for a specific city when it is hosted in another country?"} {"_id": "53256", "title": "Client's website is not showing up as expected in Google local searches or maps", "text": "I'm working for a client who runs a wellness center. He has a website as well as a Google Places account for the business. The Google Places account has a lot of tags associated with it that are relevant to the offerings from the center, and yet it doesn't show up on Google Maps or in the local search results for really obvious searches like \"massage [name of city]\".
I don't have access to the Google Places account yet, but I'm wondering if there's any way to check to see if there's some obvious problem with how things are set up. I can include the link to the website/Google Places pages. I didn't want to do that right away in case it wasn't allowed. Thanks in advance for any help! It doesn't seem to me like this question is a duplicate of the one indicated. In this case there **is** a Google Places account, but it's not showing up in the local search results or anywhere on the map for anything other than the name of the wellness center itself. If this Google Places account isn't working properly, would it be advisable to create a new one for the same location, re-verify with Google, mark the old one for deletion, and then hope the new one works better? Are there any obvious configuration issues I should look for that might explain why this center doesn't even show up on the map for a search of \"[name of first listed category] [name of city]\"? I have worked with Google Places before and never encountered this type of problem."} {"_id": "45248", "title": "SEO for multiple counties and cities", "text": "I wanted to know the best practices for improving our SEO for service areas. We have a central office but cover all of Northern Virginia for our web design services, and we are currently only ranking in the areas surrounding our office address. How can I improve our SEO for multiple cities and counties? I appreciate the advice! Thank you"} {"_id": "6904", "title": "Is it positive or negative for the average website to allow search engines to index user comments?", "text": "I've been trying to answer this question myself by researching and searching on Google, but I haven't been able to come up with a final answer. I have some hypotheses I would like to share anyway: 1. The average website intends to reach the largest audience possible. 2.
User-generated content in the form of comments is an efficient way of increasing the relevance and valuable information of the average website. 3. Indexing comments will make any site attractive to spammers. 4. Indexing comments is more convenient for new/low-traffic websites than it is for old/high-traffic websites. 5. The web in general is currently choosing not to index comments on the average website for fear of spammers. 6. The average author of a content-valuable comment would like his comment to be indexed. 7. Not indexing comments discourages the creation of content-valuable comments. So, what do you think? Is it positive or negative for the average website to allow search engines to index user comments? Why?"} {"_id": "51128", "title": "Hidden page, but still crawlable?", "text": "Is it possible to hide a page from users, but still have search engines crawl the content? I have RSS feeds which are really messy and don't fit my theme, but they have content that I would benefit from having crawled."} {"_id": "21881", "title": "How to prevent copying of the image folder of your website", "text": "How do you prevent copying of the original image folder when the entire website is being copied? Also, what measures should be taken to prevent guessing of the original images?"} {"_id": "8686", "title": "Change color of link on hover within Google Sites", "text": "How can you change the color of a link within Google Sites when the mouse hovers over the link?"} {"_id": "19584", "title": "Is there a way or tool to find out how many results Google fetches and displays for lots of keywords?", "text": "Is there a way or tool to find out how many results Google fetches and displays for lots of keywords? I want to know how many relevant results Google displays, but for thousands of keywords. Actually, I am researching the competition on various keywords.
Is there a tool or a quick way, other than going to Google and typing in the keywords one at a time?"} {"_id": "27238", "title": "Users narrowing window on desktop to see mobile version?", "text": "I know this is going to seem like a strange question, but are there any stats on users narrowing their browser window so that media queries kick in and display the mobile optimised version of the site? We're about to mobile optimise our website. As we haven't done this yet, we can't use our analytics to answer this question."} {"_id": "51122", "title": "Will having a section that aggregates news from another site's RSS feeds get me penalized for SEO?", "text": "SEO newbie here. I have a news section on my front page that essentially reads off another site's feed (`example.com/feed/`) and posts direct links to each article back to that site. Pretty much a mini RSS reader with one subscription. Will I get penalized for duplicate content even if I link back? My question is similar to this: Will content from my site's RSS feed that is published on other sites be considered duplicate content by search engines? only that it is the other way around."} {"_id": "6902", "title": "Track page size of web app for performance", "text": "I'm about to start trialling optimisations on my web application (Apache & Tomcat). My main interests right now are: * The total page size (including external JS files such as jQuery) * The number of GETs per page I'd like to set up a systematic way of measuring the consequences of the various changes I make over time. Are there any tools that are useful for recording a sequence of page hits, playing this back (as I change the config) and recording the size of each page? Ideally I'd like something that can work with a continuous integration server such as Continuum. Rgds, Kevin."} {"_id": "151", "title": "What should I consider when choosing a hosting provider?", "text": "I finished my website. It's awesome!
How should I go about selecting a hosting provider? What features should I look for? What \"gotchas\" should I be aware of?"} {"_id": "52168", "title": "Adding direct advertisements to a MediaWiki installation", "text": "I want to add sidebar ads to my MediaWiki installation. I am planning on making direct advertisement deals, as opposed to using the built-in AdSense/Bidvertiser functionality. How do I put the ads in my sidebar?"} {"_id": "31837", "title": "How to block Baidu spiders", "text": "Most of my visits are from Baidu spiders. I don't think it helps search engines at all, so I'm thinking of how to block them. Could this be done via iptables? I'm using nginx as my web server."} {"_id": "155", "title": "What is a good WYSIWYG HTML Editor?", "text": "I've used Dreamweaver (albeit CS3, so I am a few versions back), but I am disappointed with the amount of invalid markup it produces. Are there other WYSIWYG editors out there that produce valid markup?"} {"_id": "54142", "title": "How should I structure a site with content dependent on visitor type (not user)?", "text": "I have a website that displays different content depending on two selections made by a visitor: whether they are a teacher or student, and their learning level (from 4 options). Everything is public and they don't need to authenticate to access the content. Depending on their selection, different content is displayed across the whole site, other than a contact and an about page. The tone of the language changes depending on whether the visitor is a student or teacher, and the materials available on each page also change depending on the learning level; however, in all cases the structure of the site is identical. Currently I'm using a cookie to store the visitor's selections and render different content appropriately, so I have a single set of URLs which display different content depending on the cookie, with one of the permutations as the default.
I appreciate this is far from ideal, but what is the better option? Would I be better off using a distinguishing segment for each selection, for example: http://example.com/teacher/lv3/resources/activities http://example.com/teacher/lv4/resources/activities http://example.com/student/lv4/resources/activities etc. What is the most sensible way to handle this situation?"} {"_id": "20718", "title": "Cannot send email to Google Apps alias from PHP", "text": "I'm using WordPress and sending email via Gmail SMTP: * server: `smtp.gmail.com` * port: `465` I created a Google Apps main domain called `domain.com` and the email `info@domain.com`. Then I registered domain aliases for `domain.com`. I have several other domains/sites: `domain1.com`, `domain2.com`. When I try to send email to `info@domain.com` from a WordPress site (e.g. `www.domain1.com`), it works. If I try to send email to any alias (e.g. `info@domain1.com` or `info@domain2.com`), it doesn't. I don't know why. For your information, if I try to send email from my Hotmail account it works; I can send to all aliases."} {"_id": "20865", "title": "Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request", "text": "I have a Joomla website installed and working well. I installed the Qlue404 component, which redirects the 404 page to a custom page. Sometimes, when I try the symbol ' and some others, the 404 page appears with the message \"Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.\" It also shows the server information *Apache mod_fcgid/2.3.6 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 Server at mywebsite.com Port 80*. How can I disable this message?
I'm afraid this gives a hacker all the information needed to attempt an SQL injection."} {"_id": "57907", "title": "Sudden decrease in pages crawled per day", "text": "During the last few days, Googlebot has crawled far fewer pages on my site than it normally does. I'm not seeing any new errors, and haven't identified a reason for this. Is such a huge and sudden decrease a sign there is a problem? Could this be just a temporary thing? ![enter image description here](http://i.stack.imgur.com/xU3G2.png) **Update:** The `Time spent downloading a page (in milliseconds)` graph remained steady."} {"_id": "32005", "title": "My cPanel login is being redirected. How can I resolve this?", "text": "I'm trying to get to cPanel to manage my website. When I type `www.mydomain.com:2082` into the browser window, the request seems to be redirected. I made a screencast so I could slow down the changes in the address bar. First it seems to go to http://www.mydomain.com:2082/login then http://www.mydomain.com/cgi-sys/login.cgi At this point, a screen briefly appears which says 'Login attempt failed' and then the address is redirected to `https://this22.thishost.com:2083/`, which is no relation to my site at all. This looks to me like there has been an attack on the system and the login.cgi file is compromised. Any suggestions on how to analyze this further, or fix it? Of course my 'free hosting' isn't any help at all."} {"_id": "60774", "title": "Cannot log on to POP server on my VPS or receive emails", "text": "I recently purchased an unmanaged VPS to host my business websites; however, I am struggling to get the email accounts working, as I have only ever had experience with shared hosting. The VPS is running CentOS and I have Webmin/Virtualmin installed. I have added my domain, which is lyke.org.uk, and that is working OK.
However, when I've added a user and tried to access their email account using Apple Mail, I've been able to establish an SMTP connection but I've not been able to log in using POP. Furthermore, I've set up SquirrelMail and I can send an email to any email address from there, but I haven't received any that I have sent to that email address from other accounts. I would very much appreciate any help or suggestions, as I am completely new to VPS and web hosting without Plesk or cPanel."} {"_id": "20713", "title": "A standard glyph map for using symbol / dingbat / icon fonts", "text": "I was wondering if there exists, or should exist, a standard glyph map for icon/symbol fonts. That way, if you have a system/theme with thousands of sites running, you could easily change the style of the glyphs by changing the font face. For example, if I wanted to switch from www.fontspring.com/fonts/drew-wilson/pictos to somerandomdude.com/work/iconic/ a '4' would still represent a right-hand arrow. I know there are various reasons for and against using symbol fonts. For my purposes this is a great solution and I have my own methods for dealing with screen readers etc. There is a great article and follow-up discussion about using this method here: http://filamentgroup.com/lab/dingbat_webfonts_accessibility_issues/ and there is a possible standards-compliant method here: http://www.codestyle.org/css/at-font-face/StandardUnicodeIconsWebFonts.shtml"} {"_id": "20712", "title": "SEO Score and Ranking", "text": "Does a good SEO score (over 90%, for example) guarantee a better ranking on Google (or Yahoo or others)?"} {"_id": "20863", "title": "How to remove 500 error?", "text": "I am getting a 500 \"Internal Server Error\" when I type in my domain. I called customer service and they said that my site has reached its maximum use of memory and therefore the cloud system generates this error. I have never heard of this before. What they said is I can get this error because of either higher: 1.
Memory Usage 2. CPU Usage 3. MySQL usage They recommended tweaking the website and removing plugins, etc., to get rid of this error, or upgrading. Things have not changed much here and we didn't do any major upgrade either. What should I tweak, if anything? Has anyone come across such an error? We are hosting with Lunarpages and using PHP 4.7 (it is an older site). Would upgrading to PHP 5.3 fix things? Interestingly, if I go to `mydomain.com/index.php` it works."} {"_id": "45562", "title": "Multiple domains for one site", "text": "I'm maintaining a website for a lodge which has a golf course, a spa and a restaurant, and they all have separate URLs. Now, the main lodge page contains all the content, and for now the other URLs just forward to the specific pages, e.g. spa.com forwards to www.lodge.com/spa. I assume that this is not the optimal way to make use of these additional domains, so how do I best go about this? I thought about linking from within my lodge.com page to spa.com as soon as people click on the spa link, but then again, if someone looks for a lodge with a spa, for example, it would rank lower as the spa would appear to be separate from the lodge... how do I best go about this?"} {"_id": "13913", "title": "Tabbed-Menu where tabs Change Color?", "text": "I need a tabbed-menu template where the tab for the present page **changes color**. If possible, it should play nicely with browsers that don't have JavaScript enabled. Other than the tabs, I prefer a simple blank page without complicated CSS."} {"_id": "67558", "title": "How can I add my website to DMOZ?", "text": "I have a new website that I tried to list in DMOZ, but as of now it's still not listed. Please tell me what I have to do so that my website will get listed in DMOZ."} {"_id": "25560", "title": "Does the Google crawler really guess URL patterns and index pages that were never linked to?", "text": "I'm experiencing problems with indexed pages which were (probably) never linked to. Here's the setup: 1.
**Data-Server**: Application with a RESTful interface which provides the data 2. **Website A**: Provides the data of (1) at http://website-a.example.com/?id=RESOURCE_ID 3. **Website B**: Provides the data of (1) at http://website-b.example.com/?id=OTHER_RESOURCE_ID So the whole, non-private data set is stored on (1), and the websites (2) and (3) can fetch and display this data, which is a representation of the data with additional cross-linking between those sites. In fact, the URL /?id=1 of website-a points to the same resource as /?id=1 of website-b. However, the resource id:1 is useless at website-b. **Unfortunately, the Google index for website-b now contains several links to resources belonging to website-a and vice versa.** I \"heard\" that the Google crawler tries to determine the URL pattern (which makes sense for deciding which pages should go into the index and which not) and furthermore guesses other URLs by trying different values (like \"I know that id 1 exists, let's try 2, 3, 4, ...\"). Is there any evidence that the Google crawler really behaves that way (which I doubt)? My guess is that the Google crawler submitted an HTML form and somehow got links to those unwanted resources. I found some similar questions posted about that, including \"Google webmaster central: indexing and posting false pages\" [link removed]; however, none of those pages gives any evidence."} {"_id": "31832", "title": "Do private collaboration platforms really need private file access?", "text": "I need to make a private collaboration platform, where the website is not open to public registration and all the posts are only accessible by the members. The members are the management team of a company. Along with many features, it has Announcements. When posting an Announcement, there is an option to upload images. I personally like making them public, because the files transfer faster and can be easily integrated with a CDN.
I just wonder whether I need to make the images public, or only accessible by the members."} {"_id": "36367", "title": "Can I improve my AdWords quality scores with better landing pages?", "text": "I noticed that I have some keywords in my AdWords that are totally applicable to my site, but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically at people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase \"dog food.\" I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase \"cat food.\" My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the \"cat food\" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?"} {"_id": "36368", "title": "Cache website via CDN for visitors, not for users", "text": "I'm not sure what this is called. For example, if you're logged out of Reddit, you see a cached, static version of the site from the CDN. When you're logged in, you don't hit the CDN and instead go directly to Reddit's servers. What is this called? Which CDNs provide this service?"} {"_id": "25569", "title": "Problem with displaying images on Facebook from my WordPress", "text": "When posting a link to my WordPress site geekgirlscarrots.pl on Facebook, thumbnail images don't display at all.
I've put the following code in the header: <meta name=\"title\" content=\"Geek Girls Carrots\" /> <meta name=\"description\" content=\"Visit our site! Come to our meetings!\" /> <link rel=\"image_src\" type=\"image/jpeg\" href=\"http://geekgirlscarrots.pl/wp-content/uploads/2012/02/1GGC_for_web_logo.jpg\"/> but it doesn't work. Can anyone help me?"} {"_id": "56946", "title": "How do you use 'site:' to search Google for just the domain without all the other subdomains appearing?", "text": "When I use the `site:` operator to search for domains, it returns all the other matched subdomains as well. However, I only want to see what's indexed for the main site. For example: `site:google.com` returns results from `abc.google.com` as well, but I only want to see what's on `google.com`. I don't want to use `site:google.com -site:abc.google.com` since I don't want to enumerate through all the other subdomains. Is there a way to do an \"exact\" match with the `site:` operator?"} {"_id": "60074", "title": "General HTTP error: Domain name not found", "text": "I have verified my domain through a TXT record with Google; however, my site hasn't been indexed yet and I'm facing sitemap errors in Google Webmaster Tools. I ran a DNS health check from http://dnscheck.pingdom.com/, and everything seems OK. But as you can see from the snapshot below, Webmaster Tools gives a `General HTTP error: Domain name not found` error. I will appreciate any help, and thanks anyway. EDIT: My site is buraktas.com ![enter image description here](http://i.stack.imgur.com/lU8nw.png)"} {"_id": "69096", "title": "Won't Google apply AdSense policies to session-based websites and Ajax-loaded sites?", "text": "I have seen so many websites (mostly user-generated content, getting content from third-party APIs) still displaying adult-like content (with images), yet they are running Google AdSense without any issues.
Below is a URL that displays the adult content (you can view it only if you are logged into Instagram): http://www.gramfeed.com/instagram/635186136754514912_477290089 I won't say this is a bad or adult site. Because of user-generated content, the page contains adult content (you can even see the F..k word). But they are displaying Google ads. My questions here are: If a website's main content is loaded via Ajax (but not the advertisement, header and footer), then Google AdSense or Google is not able to find the violation on the page. Is this correct? If a website displays its main content only after visitor login (but not the advertisement, header and footer), then the Google crawler is not able to find the violation on the page. Am I right on the above points?"} {"_id": "57489", "title": "Best Single-Sign-On practices for a large organization", "text": "I'm currently working on several IT projects for a large organization (100k+ members). I'm thinking of providing some kind of unification and integration of all the websites that I'm designing. So my question is, what are the best practices for such integration? I'm using mainly two `php`-based engines: WordPress and Q2A. What I would like to achieve: * Provide Single-Sign-On (SSO) for all websites managed by me. * Exchange information between pages, something like the user's top bar on Stack Exchange * Provide an SSO plugin for WordPress (there are a lot of sub-organizations in this organization; most of them have simple WordPress-based webpages; I would like to provide them with a plugin that allows users to log in via the unified SSO). So what are the best practices?
Is the SSO mechanism itself powerful enough to achieve that, or should I look for some modification of it?"} {"_id": "56766", "title": "How does Google treat multiple page title matches from same site?", "text": "I have several non-duplicate page titles that match a search query, say for the query 'stackexchange': <title>John Smith Popular Content</title> <title>John Smith Audience</title> <title>John Smith Friends</title> <title>John Smith Analysis</title> I assume Google probably displays the matching page with the highest PageRank, but does Google also take into account that this `page1` links to `page2` and `page3`, which also have the search query in the title, and therefore gives `page1` a better rank because my site also contains `page2` and `page3` about `searchterm`? The reason I'm asking is because I'm trying to improve my rankings for SE queries such as `stackexchange`, and considering loading `page1` / `page2` / ... through JS while keeping the same URL. I assume this would mean Google won't even know what content `page2` has, so I'm trying to understand its handling of multiple matches from the same site better."} {"_id": "32006", "title": "How to redirect a website from a folder to the root folder?", "text": "I was running a website in a folder (www.example.com/folder). The new version of the website is live and runs in the root folder. How should I redirect users who visit the website using the old URLs (either because they bookmarked them or through search engines)? And in the long term, can I 'clean' the search engine results by removing the old links that show up?"} {"_id": "49159", "title": "Do web hosting companies allow server-side scripts to make external data requests?", "text": "I'm currently ready to deploy an ASP.NET website and looking at web hosting companies. The C# code-behind periodically uses `HttpWebRequest` to pull data from an external website to insert into a SQL database. This is only a server script and clients won't have any effect on when or how it runs.
This is my first time deploying a site, so my question is: do web hosting companies allow external HTTP requests? Is there anything specific I should be looking for to ensure this will work when deployed?"} {"_id": "23043", "title": "What meta/link HTML tags are important for SEO?", "text": "> **Possible Duplicate:** > New META TAGS with positive effects for seo ranking in 2011 and beyond What meta tags are important for SEO? Especially for blogs. I often find something new; maybe you will correct me and add something else? * `meta name=\"description\"` - description of the page * `http-equiv=\"Content-Language\" content=\"ru\"` - language of the page * `http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\"` * `rel=\"alternate\" type=\"application/rss+xml\"` - link for an RSS channel Another thing I remembered but have since forgotten: indicating nofollow/follow and index/noindex on the page. Can somebody remind me? I also saw on a WordPress site `rel=\"next\"` / `rel=\"prev\"`; is that important for SEO? Is there something else that I'm missing?"} {"_id": "6184", "title": "How do I force a www subdomain on both HTTPS and HTTP?", "text": "For whatever reason I can\u2019t seem to get this right. I\u2019ve looked at many examples on here and on the Apache website. I\u2019m trying to force `www.example.com` instead of example.com on **both** HTTP **and** HTTPS, but I am not trying to force use of HTTPS instead of HTTP. The following code seems to work for all HTTPS connections, but will not cause a redirect for HTTP connections. RewriteEngine On RewriteCond %{HTTPS} on RewriteCond %{HTTP_HOST} !^www\\.example\\.com$ [NC] RewriteRule ^ https://www.example.com%{REQUEST_URI} [R=301] RewriteEngine On RewriteCond %{HTTPS} off RewriteCond %{HTTP_HOST} !^www\\.example\\.com$ [NC] RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301]"} {"_id": "49150", "title": "What happens to my website when domain name expires", "text": "I am new to this.
Someone just created my website for me for free, and all I did was pay for the domain and hosting. My website is new, with just around 4 posts. I cannot complete it because I am not knowledgeable enough and I don't have much time. Also, the person I paid to complete it didn't show up. I decided to let my domain name expire, but now I found out that anyone can sell their domain name or website. Can I sell my domain name, and what will happen to my website and the contents therein? If I sell my domain, does that mean the ownership of my website and its contents are also transferred to the buyer?"} {"_id": "51920", "title": "What's the difference between nofollow and canonical?", "text": "I'm confused about the difference between `rel=\"nofollow\"` and `rel=\"canonical\"`. I currently use nofollow for any links that I do not want followed (for example, rubbish content of little importance) and I use canonical for duplicate content (for example: `http://example.com/price.php?sortby=price` vs `http://example.com/price.php?sortby=alphabetic`). Maybe I'm using these wrong. I need good advice on the best way to use **nofollow** and **canonical** correctly."} {"_id": "42310", "title": "Upgrading phpBB2 to phpBB3: easy or difficult?", "text": "I recently took over a site which is mostly a phpBB version 2 forum for a niche audience, surrounded by a few fairly static pages written in PHP. Our forum user base is about 3000 users, with a hundred or so forum posts per month. We have modified a few of phpBB2's source code files from the standard issue. phpBB version 2 has been unsupported since mid-2009, though alternate phpBB2 support communities exist. The current version is phpBB version 3. The outgoing webmaster said that they had decided against upgrading to phpBB3 because they expected it to be a difficult task with little feature benefit. What is the general experience with upgrading a phpBB2 site to phpBB3?
Is the experience generally difficult, or generally easy? phpBB3 provides a conversion path for phpBB2 sites; how well does this work? Let's leave the question about the _benefits_ of upgrading to a separate discussion. Let's just talk about difficulty."} {"_id": "28547", "title": "404 Not Found Errors -> Redirect", "text": "Hi, I have recently moved my forum from MyBB to Vanilla, so index pages such as showthread.php no longer exist. This is a big problem for SEO; what do I do to redirect these to the home page?"} {"_id": "42315", "title": "cPanel Distorted w/ 401 Error on CSS/Images", "text": "This is very strange behavior. My cPanel is looking very distorted when I log in from mydomain.com/cpanel but is fine if logged in through /cpanel. When I use Chrome to check for errors, I see some CSS and images are getting a 401 error. Below is a screenshot of my distorted cPanel. I called tech support and they weren't able to answer my question or help me fix the situation! Not sure if CloudFlare has anything to do with it, but I did install it three days before this post. I went into CloudFlare and made sure cPanel is set to bypass."} {"_id": "23123", "title": "SEO considerations for multilingual websites", "text": "> **Possible Duplicate:** > How can I get search engines to crawl my site and see a localised view of > my data? I'm developing a website that will be translated into two languages: English and French. I'm just wondering what the best practices for such a site would be when it comes to SEO. So far, the website has two root sub-directories, which means URLs look as follows: * `http://www.example.com/en/home` * `http://www.example.com/fr/accueil` And in my `` tag I specify the page language using the `lang` attribute (depending on which sub-directory we're in, i.e. ``). Will this be enough for the site to rank and be indexed correctly in local versions of Google, i.e.
English pages to achieve the best ranking they can on google.co.uk and the French content to achieve the best rankings it can on google.fr?"} {"_id": "53330", "title": "Do GET variables have bad effects on a page?", "text": "I am developing my own site, and I want to add multiple languages to it. My approach is to have a GET variable `lang=LANG` to handle different languages. So, my URLs will be as such: mysite.com/index.php?lang=EN mysite.com/index.php?lang=AR ... Does Google consider these a single page? Do they share one PageRank? As far as I know, \"canonical\" is a solution to this issue, but I read on this site that Google will still consider them different pages. Here is what I found: seo and php get variables"} {"_id": "3786", "title": "Multiple Language Website and SEO", "text": "> **Possible Duplicate:** > How can I get search engines to crawl my site and see a localised view of > my data? Sorry, this post became a bit loquacious - but I haven't found any satisfactory answers on the web thus far. I've been developing a website for a while now, into which I've built support for multiple languages. All the terms, words and phrases are stored in a file in JSON format which is decoded and converted into an array. With this, it's easy to create the entire site in a different language. The website consists of 100% home-cooked code... no Wordpress, Drupal or any other framework, so none of the multi-language plugins for the aforementioned will work. To switch language, the user clicks a link at the top of the page (positioned visually at the top; in reality it's at the bottom of the HTML structure). Each page of the site has this menu, whereby a query string parameter is added. `/hotels/index.html` becomes `/hotels/index.html?lang=fr`. The script sees the `lang` parameter, changes the user's cookie and loads the appropriate JSON file, and they are returned back to the same page they were on without the `lang` parameter.
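The selection logic described above boils down to something like this (an illustrative Python sketch of my PHP; the helper name and language list are made up):

```python
# Supported UI languages and the site default; values here are illustrative.
SUPPORTED = {"en", "fr", "ru"}
DEFAULT = "en"

def pick_language(query_lang=None, cookie_lang=None, accept_header=""):
    """Order of precedence: explicit ?lang=, then cookie, then Accept-Language."""
    if query_lang in SUPPORTED:
        return query_lang
    if cookie_lang in SUPPORTED:
        return cookie_lang
    # Accept-Language looks like "fr-FR,fr;q=0.9,en;q=0.8"; walk primary tags in order.
    for part in accept_header.split(","):
        tag = part.split(";")[0].strip().split("-")[0].lower()
        if tag in SUPPORTED:
            return tag
    return DEFAULT
```

The chosen code then decides which JSON phrase file gets loaded for the page.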
This has been a very convenient way to switch languages, as the user remains on the same page, and the links are exactly the same. I built it this way after another site that required multiple language support where SEO was not a priority (an internal company-only program). If a visitor comes to the site for the first time, the script looks at the `HTTP_ACCEPT_LANG` constant and loads the appropriate language and sets the cookie (instead of geo-locating, which takes longer, and I personally dislike - I live in Thailand, and even Google serves the Thai version even though I have my OS and browser in English). My question is - I'd eventually like Google and other SE's to index the site in these other languages, but without the `lang` attribute in the URL. If the user uses Google in French or Russian, they get served the site in French or Russian, respectively. Since the URL `/hotels/index.html` would have a version in English (default), French and Russian, but still having the same URL, I assume this is a problem for Google and other SE's. Can someone explain how Google indexes multiple-language sites setup in this manner? Would using a sub-domain work best to separate them? I do not wish to alter the way URL's are handled on the site, as I like to keep as few query strings as possible from being indexed. On another note, I accidentally opened the site for indexing with the language links, and somehow, the redirects did not work properly, and I have links such as `/something/index.html?lang=ru?lang=ru` indexed in Google. I've detected these in the script and placed a 301 redirect to remove these `lang` parameters... will these bad links be replaced with the redirected URL's in Google over time?"} {"_id": "23117", "title": "SEO problems with multilanguage site", "text": "> **Possible Duplicate:** > How can I get search engines to crawl my site and see a localised view of > my data? I have recently started a CMS project in PHP, the site will be multilingual. 
Assumptions are as follows: * 2 languages in the beginning (pl-pl, en-gb) * language auto-detection based on browser language * for all languages other than the above, the fallback is en-gb * the site will have links to change language, but they will be shown as an AJAX/JS widget! I know from my experience that sometimes there are problems with correct Googlebot indexing of the home page. My question is whether Googlebot is able to index both language versions of the website, and whether I should use a subdomain (`en.webpage.com`) or a PHP GET parameter (`webpage.com/index.php?lang=EN`) to see the best results."} {"_id": "23112", "title": "URL strategy for multilingual website", "text": "> **Possible Duplicate:** > How can I get search engines to crawl my site and see a localised view of > my data? I was wondering what would be the best URL strategy for a multilingual website (for an Italy-based company). I was thinking about site.com/it and site.com/en, not showing any content at site.com and redirecting visitors to the localized site when they try to access site.com. What about buying an additional domain, site.it, and then redirecting to site.com/it? Are these good ideas? If so, **with regard to SEO**, what are the best redirect methods for both site.com (to site.com/en or site.com/it) and the additional (if available) site.it (to site.com/it)?"} {"_id": "2724", "title": "Options for adjusting paragraphs?", "text": "On my company's website, we have a page (here) that I have been asked to adjust. What is desired is to take the first paragraph and adjust it to trim down the amount of space between words. My first thought was to make the paragraph (and then the rest of the page, to match) left-aligned instead of justified. To me, however, this doesn't look very good. Are there any other options available to make this look good (or at least acceptable), or am I stuck with left alignment? Thanks. (P.S.
I don't think these tags are the most appropriate, but I can't add ones I think work better. Oh, well.)"} {"_id": "51922", "title": "Keep a Servlet/JSP web site always on SSL?", "text": "How do I keep a site always on SSL, so that the padlock symbol is always shown? Is there any way to optimize the page for SSL in JSP and servlets? Please keep the answer to JSP and servlet optimization so this doesn't turn into a long discussion."} {"_id": "55233", "title": "Will echoing content from a simple XML loaded with PHP cause the page to show up in search results (SEO)?", "text": "I want to know if I'm using PHP right to create search-friendly content. I'd like my images to start showing up in Google for my website, so I'm passing an RSS feed to a PHP file using the `simplexml_load_file` command and writing content to the page with `echo`s. $rss = simplexml_load_file('~~RSSfeed~~'); foreach ($rss->channel->item as $item) { echo '
<li>' . $item->description . '</li>'; } Will displaying a list of images and text (there's other formatting and divs that I didn't include) make this content show up in search results?"} {"_id": "45323", "title": "Small AdWords campaigns with low budget", "text": "Having a Google AdWords campaign with a daily budget of less than 10\u20ac ($13, \u00a38.5), I had these: **Setup 1** * (1 campaign with 100% budget) x (3 ad groups) x (2 to 3 ads) = 40-60 clicks/day * long list of keywords per ad group * budget ends up being spent on the ad group with the highest traffic/competition **Setup 2** * (3 campaigns with 1/3 of budget) x (3 ad groups) x (3 to 4 ads) = 20 clicks/day * fewer keywords per ad group but more closely related to the ads' purpose and phrasing, which, summed up, match and extend the keyword lists of Setup 1 * all-around performance dropped, and in some cases by a lot (i.e. clicks) * * * **Questions** Specifically for small campaigns, which I feel are not addressed as much by most guides I found around the web: 1. My assumption for the above is that the **main** reason for the drop in performance of Setup 2 is the extremely low budget. Even though generic and simplistic, is this correct? 2. How can I best manage the budget so that it is spent more universally, or up to a specific limit for each ad group? 3. Finally, does anyone have any suggestions specific to campaigns of this level?"} {"_id": "45322", "title": "Server requirements for thousands of concurrent connections?", "text": "I was wondering if anyone could provide some insight into the approximate server requirements (CPU, RAM, etc.) for thousands of concurrent connections to a small text file. The actual number of concurrent connections will vary, but it could be as high as 100,000 or more per second at times. The file that will be accessed is a very small text file (a few KB)."} {"_id": "48201", "title": "Content blocks on dedicated pages rather than listed on one page", "text": "I am doing a web site for a client.
The descriptions of all his services are rather short and precise, so I decided to put them all (about 10 of them) on one page and have a sticky menu in the sidebar for quick access to each of the services. I find that a very usable experience. And for mobile (through the responsive approach), swiping through the services is more convenient than navigating to each service on a dedicated page. But now an SEO guy comes in and tells me to put all services on dedicated pages, and the client then also has to transform all the roughly 60-word descriptions into 400-word descriptions, because Google supposedly demands 400 words on a page for it to count as a page. In my opinion this is a major step back in terms of accessibility/usability. And stuffing the descriptions with 'nonsense' just for the sake of the presumed SEO advantages? I really don't think so. I am about to tell the client that the SEO guy might not be right about this, but I am not sure, so who is right?"} {"_id": "25373", "title": "How can I prevent spam on sites which I control?", "text": "_This is a general, community wiki question to address all non-specific spam prevention questions._ _If your question was closed as a duplicate of this question and you feel that the information provided here does not provide a sufficient answer, please open a discussion on Pro Webmasters Meta._ * * * For purposes of this question, spam will include: * Any automated post * Manually-posted content which includes links to spammers' sites * Manually-posted content which includes _instructions_ to visit a spammer's site"} {"_id": "24960", "title": "Stopping comment spam with links (need suggestion)", "text": "> **Possible Duplicate:** > How can I prevent comment spam on sites which I control? After putting up with a year of spammy comments full of links on my 10 sites, I've finally disallowed posting comments containing the prefix \"http\" or \"https\" (with a message saying that I'll add the http prefix if needed). I'm happy with the results.
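The check I added is essentially this (a simplified sketch in Python; my sites actually run PHP, and the message text is paraphrased):

```python
# Prefixes I block; the rejection message is a paraphrase of what my sites show.
BLOCKED_PREFIXES = ("http://", "https://")

def comment_allowed(text):
    """Reject any comment containing a blocked URL prefix, case-insensitively."""
    lowered = text.lower()
    return not any(prefix in lowered for prefix in BLOCKED_PREFIXES)

REJECTION_MESSAGE = "Links are not allowed; post the bare address and I'll add http:// if needed."
```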
But could it be hurting the site or its visitors in any way? Any ideas?"} {"_id": "25803", "title": "Moderator with no power to fix spam bot issue", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? I think this is the correct site to post this. So I'm a **moderator** (not administrator) on anddev.org and it's completely infested with spam bots; just as fast as I ban them, more post. I'm sure they can sign up faster than I can ban them. There is a two-stage first-post sign-up validation that they have clearly got around. I'd also like to delete all posts by a user when they are banned (which I can't see an option for). Is there anything I can do? _EDIT_ This was closed for the wrong reasons: **I'm just a moderator, not an administrator.** I can't do any of the things in that wiki post. I wanted to know what I can do as a mod, how I can find more options, or even what the software is that I am moderating!"} {"_id": "56558", "title": "Major spam issue... CAPTCHAs and email verification aren't working", "text": "We have a user who claims to have a vendetta against our community. We run a fairly large social networking website, and for the past four or five weeks, this particular person (or maybe people?) has been spamming our largest discussion groups and personally attacking other members by leaving hateful comments and death threats on their pictures. We employ CAPTCHA and email verification on signup. We also have cross-post analysis that catches users mass-posting a single thing to a lot of groups. It seems as if he has hundreds of legit emails (they're all Gmail, Hotmail, Yahoo) and looking at our logs, it seems as though he's sitting there and legitimately solving these CAPTCHAs. For hours. Every single day. When he caught on to our cross-post analysis, he started generating random titles and bodies to post.
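For context, our cross-post analysis amounts to a shingle-overlap check roughly like this (a simplified Python sketch; the helper names and threshold are made up):

```python
def shingles(text, k=3):
    """Lowercased word k-grams of a post body."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    """Overlap between two shingle sets, 0.0 to 1.0."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def is_cross_post(body, recent_bodies, threshold=0.6):
    """Flag a post whose body heavily overlaps any recent post, title aside."""
    s = shingles(body)
    return any(jaccard(s, shingles(r)) >= threshold for r in recent_bodies)
```

Randomizing titles does not defeat this, but his randomized bodies do.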
He'll hit a dozen of our top groups with those over and over again until we delete his user account or he gets banned by a moderator. His IPs change constantly. Many are Tor nodes, some are vanilla HTTP proxies. His user agents change for every new user he signs up. Sometimes they're even \"GoogleBot\", which I thought was amusing. So my question is, are there best practices we could be following besides employing CAPTCHA and email verification? Ones geared towards an extremely motivated human generating spam?"} {"_id": "22892", "title": "PHP software to detect spam?", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? I'm looking for something like SpamAssassin or similar, except that it only focuses on the message rather than inspecting an entire email header. It doesn't even need to be really fancy, just do the classification of messages. In theory we could provide meta-data such as IP. I'm doing this because I want a feedback forum without authentication or reCAPTCHA or having to put every email in the \"pending\" bin before it's published."} {"_id": "4202", "title": "Tons of spam on phpBB even with reCAPTCHA enabled", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? I am running a phpBB forum where the requirements were that users shouldn't have to sign up in order to post. They could just give a name and post as long as they used reCAPTCHA before posting. The problem is that there are like 30 pages of spam in some threads. The software is up to date. The forum gets very little legitimate traffic. If I can't fix this I will either have to require registration or remove the forum altogether."} {"_id": "47995", "title": "How to prevent common spam on a WordPress blog", "text": "Right from the day I installed WordPress on my site, I started getting spam comments/trackbacks. I have also prevented anonymous comments.
I get the following email whenever new spam is posted: New trackback on the post `\"\"` is waiting for your approval `` Website : jackettl36 (IP: 112.111.173.89 , 112.111.173.89) URL : Trackback excerpt: fake OAKLEY sunglasses... ... How can I prevent such automated spam?"} {"_id": "8046", "title": "Recommended Spam Best Practices", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? Are there any recommended admin tools or interfaces for reviewing and moderating spam on a community-driven UGC site?"} {"_id": "28225", "title": "Does anyone run a site with Pligg? What is the most effective way to stop spam sign-ups?", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? Specifically trying to combat things like this! http://www.autopliggdotcom (didn't want to give them a link!)"} {"_id": "53823", "title": "Is there a way to control spam on a site where user contribution is allowed without accounts?", "text": "I am in the process of designing a database and accompanying website where users can submit local dance events. Part of the site will be the posting of new events but also the vetting of current events. Users will be able to flag event info as correct or incorrect, flag info for review, and mark an event as interested/attending. Is there a way I can mitigate the amount of noise and spam since the site will be public and not have accounts?"} {"_id": "719", "title": "Free spam blocker service", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? Is it possible to get a free and good spam blocker for your website? I would really like to avoid having to serve CAPTCHAs to my site's users, but I don't want spam bots to be able to post links etc."} {"_id": "25586", "title": "Spammers in my Magento blog comment section!", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? I have a blog extension installed on my Magento website.
But I regularly receive spam comments on the blog. To help fight against automated and manual spam comments on the blog, I have enabled a CAPTCHA service. But even with the CAPTCHA service running, the site still receives spam. Is there a more effective method to avoid spam?"} {"_id": "194", "title": "Effective captcha solutions", "text": "> **Possible Duplicate:** > How can I prevent spam on sites which I control? ReCaptcha is all the rage at the moment. Based on customer/user feedback, what other CAPTCHA or general human verification tools have you used for your website? Links to the API/site appreciated. Thanks guys!"} {"_id": "45327", "title": "Can I create test accounts in Google for testing OAuth in my application?", "text": "My app contains a Sign in with Google function that uses OAuth. To test it, I'd like to create test accounts at Google. But Google doesn't let me create ordinary accounts without entering a CAPTCHA, so I can't use ordinary accounts for testing. Can I create test accounts in Google to test OAuth in my application?"} {"_id": "2729", "title": "How much should a newbie charge a client for web design work?", "text": "I was asked to design a website for a client. Personally, I'd work for free since I'm just in it for the experience of working with a client, but they insist on paying me so I will get an idea as to how freelancing works and for them to know how much they should pay for future services. I have very little experience when it comes to real work since I'm still a student, but I've had internships as a web designer/developer in some software companies here in my area. I can build decent web pages using HTML, CSS, Javascript/jQuery, Java/JSP/J2EE and SQL. I am also familiar with some CMSs like WordPress and Joomla. How much do you think I should charge my client if I were to design a simple static but somewhat functional website for them, and how should I charge them for my services?
I forgot to mention that I am located in the Philippines. Since it's a 3rd-world country, I'm pretty sure the rates here are lower than in most areas."} {"_id": "33032", "title": "How to handle URLs with diacritic characters", "text": "I am wondering how to handle `URLs` which correspond to strings containing diacritics (`\u00e1`, `\u01da`, `\u00b4`...). I believe what we're seeing mostly are `URLs` where diacritic characters were converted to their closest `ASCII` equivalent, for instance `R\u00e5nades p\u00e5 Skyttis i \u00d6-vik` converted to `ranades-pa-skyttis-i-o-vik`. However, depending on the corresponding language, such conversion might be incorrect. For instance in `German`, `\u00fc` should be converted to `ue` and not just `u`, as seen with the below `URL` representing the `Bayern M\u00fcnchen` string as `bayern-muenchen`: http://www.bundesliga.de/en/liga/clubs/fc-bayern-muenchen/index.php However, what I've also noticed is that browsers can render non-`ASCII` characters when they are percent-encoded in the `URL`, which is the approach `Wikipedia` has chosen, for instance `http://de.wikipedia.org/wiki/FC_Bayern_M%C3%BCnchen` which is rendered as: ![enter image description here](http://i.stack.imgur.com/gvRTK.png) Therefore I'm considering the following approach for creating `URL` slugs: -(1) convert strings while replacing non-`ASCII` characters with their recommended `ASCII` representation: `Bayern M\u00fcnchen` -> `bayern-muenchen` -(2) also convert strings to `percent encoding`: `Bayern M\u00fcnchen` -> `bayern_m%C3%BCnchen` -create a `301` redirect from version (1) to version (2) Version (1) `URLs` could be used for marketing purposes (e.g. `mywebsite.com/bayern-muenchen`) but the `URLs` that would end up being displayed in the browser bar would be version (2) `URLs` (e.g. `mywebsite.com/bayern-m\u00fcnchen`). **Can you foresee particular problems with this approach?
(Wikipedia is not doing it and I wonder why, apart from the fact that they don't need to market their `URLs`)**"} {"_id": "9192", "title": "What are some good Apache settings to use with WordPress?", "text": "My server has 756MB of RAM. Every few days I get `[error] server reached MaxClients setting, consider raising the MaxClients setting`"} {"_id": "39809", "title": "Better Image Hosting for MediaWiki", "text": "We're using MediaWiki and writing a lot of articles / tutorials that require images / screenshots. Image management in MediaWiki is painful. I was hoping I could use something like Dropbox to store all the images we need and then share them via external links. However, public folders aren't sharable and I can't make shared folders public. Is there something like Dropbox where I can easily just drop the files in a folder, have them sync with all my colleagues, and get static URLs to the files to put on MediaWiki?"} {"_id": "7233", "title": "What terms/conditions should I have in place for a site popular with children/teenagers?", "text": "I'm working on a site with a forum, for which the current user base is fairly young. For the forum, where do I stand legally with children/teenagers signing up? For example, I believe Facebook restricts membership to 13+. Should I have this restriction too? (We don't really have the means for any kind of age policing.) Are there any standard terms/conditions I can add to the site?"} {"_id": "7942", "title": "Set goal in funnel not to be necessary", "text": "I have a start page which can link to a discounts page and then three possible extras pages (one of which will definitely be reached), and all three of these lead to a confirmation page. I have set up goals on each page, and on the confirmation page I have the following funnel: Step 1: /G1/StartPage Step 2: /G1/(Extra1|Extra2|Extra3)Page However, I want to include the discounts page, but not all customers can see that page. e.g.
Step 1: /G1/StartPage Step 2: /G1/DiscountsPage (may not be reached) Step 3: /G1/(Extra1|Extra2|Extra3)Page Can I do this in the same funnel, or do I keep my current funnel above and create a separate funnel (whose goal is the confirmation page again) that does include the discounts page?"} {"_id": "22303", "title": "Why should we use tags like p, span, hx tags when we can use CSS instead?", "text": "Why should we use tags like p, span, hx when we can use CSS instead? Is it important from an SEO point of view?"} {"_id": "10373", "title": "Is there a way to save MS Word document as HTML w/o the ms proprietary stuff?", "text": "So normally I wouldn't use this feature (\"Save as Web Page\") but I have large documents from clients that they just want put on their site as HTML, and formatting it all by hand seems like a waste of time. I have tried \"save as webpage\" in Word 2007, but it produces all sorts of bad stuff. To wit: as well as a large block of XML formatting info: In the above I have split the script tags so that the top part can appear on every page and then the bottom part just appears on the order confirmation page. I have tried combining into a single script tag but this didn't solve the problem. Could the hyphen in the SKU or the number of decimal places in PRICE or QUANTITY be causing a problem? Or is there just a typo that I can't see? Any help appreciated! John"} {"_id": "39389", "title": "Is it good to have a separate search results page for a website?", "text": "In terms of SEO and maintainability: my website is based on WordPress, it's currently using WordPress search, and it's using the same URL as the home page to return search results, i.e. say my website is at `domain.com`, then the search results are at `domain.com/?`"} {"_id": "51934", "title": "Structured data: Is it okay to have more than one image, description and SKU per page?", "text": "On a few pages of a website, similar products are displayed on the same page.
Each item has its own description (almost identical) and its own SKU (also almost identical). How can I use microdata correctly here? I have tried, for testing purposes, to add the code for image, description and SKU for all 3 items in one page, and I have tested the page in Google's Structured Data Testing Tool. Google read the data and displayed the URL for all 3 images, all 3 descriptions, and all 3 SKUs. Is this the correct approach, or is this wrong? Not sure if it makes a difference, but at the moment, this is not an e-commerce site.
    Test name 1 Description test 1 #001
    Test name 2 Description test 2 #002
    Test name 3 Description test 3 #003
    "} {"_id": "42385", "title": "Network unreachable: robots.txt unreachable", "text": "I am trying to add a valid sitemap to Google Webmaster. Yet, it says: > Network unreachable: robots.txt unreachable. We were unable to crawl your > Sitemap because we found a robots.txt file at the root of your site but were > unable to download it. Please ensure that it is accessible or remove it > completely. and > Network unreachable: robots.txt unreachable. We were unable to crawl your > Sitemap because we found a robots.txt file at the root of your site but were > unable to download it. Please ensure that it is accessible or remove it > completely. Yet, I can access both my robots.txt and sitemap.xml. I have read other posts here and there, but could not figure out what is causing this issue. Does anyone know?"} {"_id": "56086", "title": "Tomcat WebDAV root directory", "text": "Is it possible to configure WebDAV in such a way that it serves content outside the root directory? I basically want to serve content via WebDAV from an absolute path `C:\\Content`. (A third party application will drop files in that directory.) I'm running Tomcat 7.0."} {"_id": "50330", "title": "What does Analytics '% of Total' mean?", "text": "I feel this may be a stupid question but I don't understand this: Under total visits I have around 6,000, which is \"% of total 75.58% (8,000)\". What does this mean, and why doesn't it just list 100%?"} {"_id": "32581", "title": "Question about paginated based URLs and category URLs with XML sitemaps", "text": "I have a blog site that has this URL structure: http://website.com/category/ http://website.com/category/page1/ http://website.com/category/subcat/ http://website.com/category/article_title ### A flow could be like this: Visit the website > Get to a category > Filter by sub category > Click on article The category URLs show 12 of the newest posts, then they are paginated until they display the rest. The sub categories do the same thing.
**My question is:** Should I include the category, paginated, and sub category links? Or is the article title good enough for the sitemap? *EDIT: Since /category/ and /category/subcat/ are the same content, I use a canonical URL on the subcat URL to point to its parent."} {"_id": "55785", "title": "In WordPress, should I \"Allow search engines to index this site\" during development?", "text": "I am going to create a web page, and I will do it online. Should I choose the option \"Allow search engines to index this site.\" during development, or should this be enabled only after development ends?"} {"_id": "59923", "title": "eCommerce checkout security", "text": "I'm developing an eCommerce site and using SagePay as a card processor. Customers will enter their delivery/invoice details into a form which is then submitted to a checkout page. This page encrypts the data and sends it over to SagePay to process the payment. No credit card details are entered or stored on my site; this is all handled by SagePay. Only the address details are stored in my database. My question is: what sort of security do I need to implement? As I see it, my options are: * Use an SSL certificate to provide a secure site * Encrypt the address details before insertion into the database (I'm intending to do this anyway) * Do nothing For reference, the site is written in ColdFusion."} {"_id": "16838", "title": "If my page titles are dynamically created with PHP will Google be able to read them?", "text": "My page titles are dynamically created with PHP: I set my page title in my PHP functions, and when the function is triggered the title is set. Is Google able to read the page titles?"} {"_id": "32584", "title": "How to reliably determine server stack of a site (eg .Net vs Open Source)", "text": "I've been trying to figure out how to determine the use of the .Net stack compared to Open Source OS's. I have read that Netcraft.com is one place to find out.
But when I put in one of my own sites (which is .Net), it comes up as unknown, and when I put in StackOverflow.com, it comes up as a Linux OS and \"unknown\" server. I thought SO was .Net MVC (has that changed)? Given that Netcraft came up with wrong/incomplete answers in these instances, do you think their statistics overall are still reliable? http://toolbar.netcraft.com/site_report?url=http://stackoverflow.com"} {"_id": "32587", "title": "How can I remove the security/malicious user warning from my website?", "text": "I have a domain name tradespring.net, and www.tradespring.net that redirect to my Heroku app with a CNAME record. However when I first try to access these sites it gives me a malicious warning. > This is probably not the site you are looking for! blah blah blah then > \"proceed anyways\" or \"back to safety\" Its because my browser realizes that it is redirecting. How can I make sure anyone's browser (not just my browser) trusts this site and my Heroku app? I don't think i need an SSL certificate because this site is not sending sensitive info (credit card info, etc.)."} {"_id": "54176", "title": "Do we need to block repeated pages content for SEO relevance", "text": "I have multiple purchase pages with the same content like: product1red.php product2green.php Should I block them with robots.txt ?"} {"_id": "42994", "title": "I suspect our agency artificially increased traffic", "text": "I had a first suspicion when I compared our previous CPC campaign, managed in house with the one who was done by agency. In previous campaign there was no traffic increase from organic traffic, while in the campaign managed by agency there was a large and correlated increase of traffic from both - CPC and organic traffic. Look at the image. I have looked into various things in Analytics, but the strangest thing was appearance of CPC searches with plus sign, like +very +best +product. Now I doubt that there are lots of people doing search like that. 
Also, there was a significant increase in average duration and pages per visit. We did some other marketing activities at that moment, but it looks like a very unlikely coincidence. Please advise how we can bust them, because my boss is about to sign a contract to do SEO with them..."} {"_id": "6751", "title": "What analytics software to use for small Apache served website with banners not from Google", "text": "I'm planning to run a small website using Apache (and other) server(s). The website supports configuring different banners for different pages dynamically. The banners are not from Google but \"manually chosen\". I don't expect too much traffic but still some, say 1,000 page views a day. I'd like to have statistics for the different banners (which one was clicked how often per page view etc.) and general statistics about users and sessions. So far I only know about Google Analytics and AWStats. I already asked another question about whether Google Analytics supports statistics for other ad networks, and obviously it does, but only when using some workarounds. Which analytics software (including but not limited to Google Analytics and AWStats) would you recommend for my purpose? Ideally, it should be free and easy to use."} {"_id": "6754", "title": "When using email as login name, what precautions should I take for registration?", "text": "When your registration form is using a user ID that is the same as their email address, I'm somewhat concerned over spammers using the \"User name is in use\" message as a means to validate the existence of an email, or if there are other concerns I need to know of for validation's sake (for instance, converting CaMeLcAsE to lowercase seems to be the status quo despite the RFC spec). This question is not about OpenID."} {"_id": "35584", "title": "Track sales and commission with third-party tool", "text": "I have a clothing website where I link to various clothing retailers.
I have reached an agreement with one of the retailers whereby they will pay a commission to us for every sale they make from traffic that was referred by our site. I need a mechanism for tracking how much commission should be paid to us, that involves as little work as possible to implement from their side. We both have Google Analytics. **Option 1:** They record a goal in their GA account whenever someone makes a purchase on their site. They see how many completed goals are marked as referral traffic from our site and calculate commission accordingly. The problem with this is that the whole process of calculating and paying commission will be manual. They will need to frequently check how many sales were generated by referral traffic from our site, and probably we will have to chase them for commission payments. Also - since we won't have access to their GA data - we will need to trust that they report all sales accurately. **Option 2:** Sign them up to an affiliate network like Commission Junction or Google's Affiliate Network, and connect to them through this network. The problem with this solution is that it seems too heavyweight; ideally we don't want to ask a retailer to go through the whole sign up process just to deal with us and pay us commission. I am assuming that there must be some lightweight service that tracks the number of sales by one site and pays commission accordingly to the other site, where the sign up and installation procedure is simple and fast."} {"_id": "26825", "title": "Need a good, free, light help desk package", "text": "> **Possible Duplicate:** > Helpdesk software I'm launching a new web site that needs a basic help-desk system-- something that will track ticket emails, keep a log, have some basic statistics and possibly power a knowledgebase/FAQ section on my site. 
My ideal solution would be hosted (I'd rather not lose a day having to fuss around with installing software on the server), MODERN and CLEAN (read: not feel like it was designed or developed in 1997, which is what I keep coming across) and easy. Simple. Clean. Current. (You know, like the stuff 37signals puts out, or Fog Creek's CoPilot.) Because we're talking about no more than 10 tickets a week right now, it seems reasonable that I should be able to have the features for free now, to determine if it's right for me. Of course, I'm totally fine with a package that I will pay to grow into, as my needs expand. If I have to install on my server, it should be Linux based (CentOS). Zen Desk seems to nail it in the features department but feels like overkill for my (currently, very small) needs."} {"_id": "9940", "title": "Which mobile device is appropriate as a utility tool for a web master?", "text": "Basically, I'm looking for a device to use on the road and I would prefer to not have to sit down for the majority of the tasks (which rules out netbooks, in my mind). I'm also hoping to spend less than $500. This is what I'd like to \"capably\" be able to do on the device: * Browse the web in non-mobile format, flash is a plus * Email, chat, etc * Have access to a decent text editor and ftp OR a browser that supports BESPIN/ACE * Some sort of SSH support I'm looking at rooted Android phones and iPhone/iPads... though the phone aspect is only icing (it would be cool to consolidate the two devices and have net access through cell networks, but I'm not married to the idea). Are there cheap linux tablets that are ready for prime-time yet? I suppose that would be ideal. All suggestions welcome!"} {"_id": "35587", "title": "Google Fetch issue", "text": "When I do a Google fetch on any of my webpages the results are all the same (below). I'm not a programmer but I'm pretty sure this is not correct. 
Out of all the fetches I have done, only one was different: the content length was 6x the one below, and it showed meta tags etc. Maybe this explains other issues I've been having with the site: a drop in indexed pages. A meta tag analyzer says I have no title tag, meta tags or description, even though I have them on all pages. I had an SEO team working on the site and they were stumped by why pages were not getting indexed. So they figure it was some type of code error. Are they right? HTTP/1.1 200 OK Cache-Control: private Content-Type: text/html; charset=utf-8 Content-Encoding: gzip Vary: Accept-Encoding Server: Microsoft-IIS/7.5 X-AspNet-Version: 4.0.30319 X-Powered-By: ASP.NET Date: Thu, 11 Oct 2012 11:45:41 GMT Content-Length: 1054 "} {"_id": "6643", "title": "How do I buy this domain name?", "text": "> **Possible Duplicate:** > How can I buy a domain that has already been registered? How can I buy the domain name `www.autospell.com`? It seems to have a weird status with no website, but it is also not available to buy."} {"_id": "12397", "title": "Getting a domain name owned by someone else?", "text": "> **Possible Duplicate:** > Buy a registered domain? I'm in the process of creating this website, but the name I wanted for the hosting seems to be taken. Namejet seems to be auctioning it, but it seems a bit too fishy for my taste. I go to the link and there's nothing there, just ads. Can anyone recommend what the best way to get a domain owned by someone else is? My only alternative seems to be making an extremely long one nobody will remember! Sorry for the extremely noob question, but hey, need to start somewhere."} {"_id": "49004", "title": "How do I register a domain name that has been registered?", "text": "I am writing to ask about a domain name I wish to own. I have contacted the domain owner (found by performing a WhoIs search) by email and have not received a reply. I have tried contacting him for over a year.
I have also contacted the domain registrar (`namecheap.com`) where the domain is registered, and they have told me that they cannot contact him on my behalf, and told me to contact him using the email address. Are there any ways of securing this domain name for myself? I have waited for it to expire but unfortunately, it automatically renewed!"} {"_id": "35582", "title": "FTP client says directory permissions are 0000", "text": "I encountered a problem that appears to be related to permissions (same as this one). When I looked at the permissions of various directories in FireFTP, they all were 0000, which is obviously nonsense, since all the other pages are served without any problems. I only have ftp access, and I think it\u2019s a Windows server (The header says \"Server: Microsoft-IIS/7.5\"). I\u2019ve tried a different ftp client, with the same result. Changing the permissions results in a \"'SITE CHMOD 764 [directory]': command not understood\" error. A search on Google and SO for \"permissions windows 0000\" and similar queries didn\u2019t turn up anything. How can I view and change the actual permissions?"} {"_id": "39035", "title": "Problems Using CloudFlare On Blogger", "text": "Here's the situation. I got a TLD for my blogger blog and set it up using the instructions from blogger. Blogger asks to: Add two CNAME records. For the first CNAME, where it says Name, Label or Host enter \"www\" and where it says Destination, Target orPoints To enter \"ghs.google.com\" . For the second CNAME, enter \"NHRILA4K2RJG\" as the Name and \"gv-GQMUMYGHAMJWECXFLJXVXABIV23C55JIPNIAVD5IGFSXT653O5GA.domainverify.googlehosted.com.\" I did that on my domain host, and everything was working smoothly. Here's the things that happened: * Typing myblog.blogspot.com in the address bar brought me to my new address www.mynewaddress.tld * Typing my newaddress.tld brings me to www.mynewaddress.tld Now, I went through the instruction to setup CloudFlare and did everything as required. 
I saw that CloudFlare is active and working on my TLD www.mynewaddress.tld, however, when I am typing the blogspot address, i.e. myblog.blogspot.com, it's showing a notice that the blog is not hosted on blogger and that I should click \"yes\" to get redirected to the new website. However, the blog is still on blogger. I think the problem might be with this particular CNAME record Google asks to create, which I did not find imported to the CloudFlare nameservers: For the second CNAME, enter \"NHRILA4K2RJG\" as the Name and \"gv-GQMUMYGHAMJWECXFLJXVXABIV23C55JIPNIAVD5IGFSXT653O5GA.domainverify.googlehosted.com.\" So I create that CNAME and added it to the CloudFlare panel. My question is - is that what will help Google determine that my blog is still hosted on Blogger? If so, should I turn off CloudFlare for that particular CNAME record or turn it on? Any help is very much appreciated :)"} {"_id": "65415", "title": "CloudFlare - Difference between Basic, Simplified, and Aggressive caching", "text": "CloudFlare has this page to explain the difference between their caching settings: https://support.cloudflare.com/hc/en-us/articles/200168256. However, I'm still confused. It says: Basic: example.com/pic.jpg Simplified: example.com/pic.jpg?ignore=this-query-string Aggressive: example.com/pic.jpg?with=query So I'm assuming that Basic means it doesn't pay any attention to the query string (it's treated as a single file no matter what query string is included). Simplified seems to say any query string will be ignored. And Aggressive seems to say the query string basically means it's a different file. So, my question is, what's the difference between Basic and Simplified? 
If Basic ignores any query string, and Simplified also ignores any query string, aren't these the same?"} {"_id": "9949", "title": "Do you know a good html mailing list management software with admin levels?", "text": "I'm basically looking for a program/app/script (can be commercial) which I can ideally install on a windows server (we can run asp, asp.net php mssql) we have different groups of people who send newsletters to web members, I want to bring it all into one app which I can monitor and control. Ideally it would be able to create html newsletters, (with some templates) track emails and click throughs. Manage email lists subscribe/unsubscribes. And importantly have different levels of admin, so a newsletter creator could log in and create and send off an email, it goes into a queue where a communications editor can have an overview of all newsletters and approve the sending of the emails or edit them before they are sent off. before I start coding something up myself I thought I'd ask if anyone has any advice! Cheers!"} {"_id": "34567", "title": "Looking for full web based booking system", "text": "I've searched around here but could not find any booking system that would suit my need. I hope maybe you can help me out. I've already seen question #11379. Solutions provided there are not helping and scripts not interesting. I'm looking for a booking system that support infinite number of hotels/rooms/pricingschemes. There should be unlimited number of hotel owner/users, who can access control panel and set availability for their hotel. Or create seasonal prices and adjust them. I should be able to have 15% booking fee paid through credit cart. I can work on integration with my system, it just needs to have correct APi/functionality to support this. I love wordpress so if there are some nice plugins for that, I'm open for suggestions. But this is not a must. 
It should be php/mysql based as I'm not good at anything else :)"} {"_id": "32965", "title": "Responsibility in case of copyright infringing user-submitted content", "text": "Not sure if this is the right place to ask, but it seemed like one of the better options. Example here. Say I have an online library where users can submit e-books and PDFs they created. All fine and dandy, until someone comes around and starts uploading books that are not their own. Sure, we have a moderation team in place that regularly checks new additions, but some may slip past the radar, especially when dealing with fairly unknown authors. The author of the (illegally) uploaded books finds the site, browses it for a bit, and to his shock finds that his books are available there for free. Copyright infringement! They sue me for hosting the library website, and I get in deep trouble because some user did something he wasn't allowed to. How can I avoid getting in trouble over this? Will I be safe if I state in the Terms of Use that it is not allowed to upload copyright infringing material, and that all users are responsible for what they upload? Or does the \"web admin is responsible for his sites and their content\" clause negate all others? Thanks! UPDATE: Also asked a lawyer online. They gave an answer pretty close to what the people on here said. See this link."} {"_id": "39032", "title": "How can I create an individual \"Projects\" page in Jekyll?", "text": "On my Jekyll powered website, I have a page where I list every project (inventions, experiments, etc.) that I have ever worked on. Currently, each project listed exists only as a link to a relevant blog post.
My question is, how do I create individual pages for each project so that each resides in its own subdirectory like so: example.com/projects/project1 example.com/projects/project2"} {"_id": "56474", "title": "Will running an Azure website on a subdomain of azurewebsites.net hurt SEO compared to buying a domain?", "text": "When hosting a website on azure websites I have two options: 1. Be a cheapskate and go for the free option without a domain. I would then set up web forwarding with my domain registrar from my domain to the `domain.azurewebsites.net` address 2. Go for the paid option and set up a CNAME and A record so that traffic to my domain (with or without the `www` prefix) is routed directly to the site Could anyone please advise what impact going for option 1 will have on Google/Bing ranking performance compared with option 2?"} {"_id": "19245", "title": "Customized 404 page on yahoo webhosting", "text": "I want to add my customized 404 error page. On an Apache server, an entry made in the .htaccess file does this task. I have my website on Yahoo web hosting. Yahoo does not support .htaccess files. Can anyone please guide me on how this can be achieved on Yahoo web hosting?"} {"_id": "16216", "title": "How can I identify unknown query string fragments that are coming to my site?", "text": "In the Google Analytics content overview for a site that I work on, the home page is getting many pageviews with some unfamiliar query string fragments, example: /?jkId=1234567890abcdef1234567890abcdef&jt=1&jadid=1234567890&js=1&jk=key words&jsid=12345&jmt=1 (potentially identifiable IDs have been changed) It clearly looks like some kind of ad tracking info, but no one who works on the site knows where it comes from, and I haven't been able to find any useful information from searching. Is there some listing of common query string keys available anywhere?
Alternatively, does anyone happen to know where these keys (`jkId`, `jt`, `jadid`, `js`, `jk`, `jsid` and `jmt`) might come from?"} {"_id": "16743", "title": "What dodgy scams are there to get to the top of Google searches?", "text": "I was speaking to a relative and they mentioned that the designer/developer who produced their site had a way of getting them to the first page of Google. I did advise him that this could be potentially dodgy and that if he's not using legit SEO techniques then he could get thrown out. So what dodgy packages are there at the moment that he could be using? Because from the look of the site and the number of orders it's taking, I really can't see how it could've organically risen to the top. Any ideas?"} {"_id": "19246", "title": "Should you add version information to images on your website?", "text": "If you have images on your website that are likely to change (e.g. have a different watermark) but keep the same image name, should you add version information to the end of the image name, or would this be bad for SEO purposes? I have an issue where we changed the watermark on a number of images 5 months ago, but Google Image search still shows the old watermark on the images. So I am assuming that Google hasn't downloaded the new images, as the filenames are the same as the old ones."} {"_id": "19241", "title": "How to avoid leeching files?", "text": "I use the Kleeja script to host files and noticed that certain files are downloaded many more times than the site's visit counter indicates. So I assume that the files are being leeched. The web server is Nginx. Just wondering how to block leeching. Thanks"} {"_id": "16744", "title": "Embedding *.swf objects: javascript vs xhtml", "text": "I am a newbie in webdesign and I've been wondering which is the _modern_ way to embed swf objects. I've been using the following html code: However, I've noticed that some designers use a javascript to embed swf files.
Like this one. It seems that both ways produce the same results, but: **What are the shortfalls of these methods?** **Which method is the newest?** **Are there any compatibility issues regarding the web browsers?** Thanks in advance. I've posted this question on Stack Overflow too. I don't know exactly where this question should be posted."} {"_id": "19931", "title": "How do I grant 777 (rw) permission in shared windows hosting", "text": "I have a windows shared hosting on which I have deployed my wordpress blog. Now, to increase my page speed, I am trying to cache a few resources using W3 Total Cache. While trying to activate the plugin I am getting the following error: > E:\\inetpub\\vhosts\\subhendu.info\\httpdocs\\blog/wp-content/dbcache could not > be created, please run following command: chmod 777 > E:\\inetpub\\vhosts\\subhendu.info\\httpdocs\\blog/wp-content then Retry I tried creating a few directories using FTP but realized it won't be of any use, as caching files also cannot be placed dynamically. I tried changing the chmod to 777 using FileZilla but am getting the error **_500 'SITE CHMOD 777 wp-content': command not understood_**, mostly because this is a windows hosting. Again, I do not have shell access to my server; I have to do everything using FTP. If there is no other way and I have to ask my hosting provider for support, what shall I ask him? Does he need to grant RW permission to any specific users?"} {"_id": "16746", "title": "Planning for catastrophe", "text": "I work for a small marketing company that also does web design and development. We host all of our web design and development customers on a dedicated server at Hostgator. We have a dedicated server with RAID 1 configured hard drives. We also do weekly backups, which are automated through cPanel and downloaded by automated FTP software locally. Today we were discussing what we would do if Hostgator had a catastrophic failure of some kind.
It could be the server exploded, Hostgator had serious network issues, the FBI did one of their famous \"take every server we see\" raids, etc. Basically any scenario where an extended outage is expected. We then took it to the next level and wondered what we would do if Hostgator had an extended outage and we were unable to access our local backups. This could be due to fire, flood, etc. I know the odds of our server being down for an extended period of time _and_ our local files simultaneously being inaccessible are remote, but all it takes is just _two_ bad things to happen and that's where we would stand. (If you have ever gotten a flat tire and found out your spare was flat or missing, you know how easy it is for two bad things to happen simultaneously.) Needless to say we want to be prepared for \"worst case scenario\" type events, as this would almost certainly put us out of business. So my two questions are: 1. What could we do to be prepared for an extended outage by Hostgator? An ideal scenario will have our clients' websites, and hopefully emails, up and running again quickly. 2. What would a robust backup plan include so important data is never lost? An ideal solution will be automated. You can assume cost is not an issue in your answers, but the more affordable a solution is, the better."} {"_id": "57782", "title": "Does the Duplicate Content penalty apply to mobile apps?", "text": "Our website content and mobile app content are the same. Will Google assess a duplicate content penalty against our web site as a result of this?"} {"_id": "58727", "title": "Should Schema.org be used on embedded YouTube videos or only self-hosted videos?", "text": "Google supports and recommends using the schema.org on-page markup for videos. However, it is not clear if this should also be done when embedding a YouTube video. Or really even a Vimeo video.
Or is the schema.org only necessary when hosting the videos yourself?"} {"_id": "49284", "title": "Which folders need to be backed up for migration in Joomla?", "text": "I'm helping someone update & migrate their old website, built on the Joomla framework. Currently it is running on Joomla 1.5.8 which is an ancient version. I've convinced them to upgrade Joomla to at least 2.5 I have already made a backup of the database. Most links I have seen talk of backing up the entire `public_html` folder (The website runs on a shared host). But in my fresh Joomla installation there are several folders that are in the `public_html` folder. So which of the folders in the `public_html` folder are from the content of the website, and which are of the old Joomla framework? I'm afraid that I might overwrite files of the new Joomla framework with the old framework, if copy all the files and folders into the new installation."} {"_id": "32729", "title": "How can I get my dynamic site search results content indexed by Google?", "text": "I have a site that is simply a search box to search a cloud-hosted database of .tiff images, and then all of my content can only be accessed by entering a search term. So for example, you're on the home page www.example.com and you type in \"search\" to the box and hit submit. Then it takes you to www.example.com/?q=search, which is a page of all my .tiff images with \"search\" in the description. How can I get a page like www.example.com/?q=search indexed, WITHOUT making a humungous list of search terms that people might type in?? I know about mod_rewrite, but it seems like for that you need to know ahead of time which URLs you'll need to convert, which I don't. All of these pages will be dynamically user-generated by typing into the search field. 
Please help!"} {"_id": "19939", "title": "How to add Default Document in a Windows Shared Hosting", "text": "I have a shared hosting plan where I have hosted my WordPress blog. Now each time I want to access the page I have to type index.php after the domain name. If I try to access it without using index.php I get a 404 error. Is this because index.php is not added to the Default Documents list? Is there any way I can add to the default documents list using the Plesk control panel?"} {"_id": "16218", "title": "URL rewrite to fix broken links into a subdirectory -- subdirectory was moved to the root directory", "text": "I had an e-commerce website hosted on `http://mydomain.com/beta` for more than a year; eventually I decided to move the website to the root `http://mydomain.com` I had done quite a lot of link postings to forums etc. when my site used to be hosted in the sub-dir /beta. Is there a way to do a mod_rewrite by which all the old links that I have posted do not return as broken links, since the site is no longer hosted in /beta and is now hosted on the site root? I did read that mod_rewrite can help resolve this issue, but also read that this has to be done with care. Just a tip that this site is using friendly URLs."} {"_id": "42789", "title": "Google Analytics - include filter not working", "text": "I just added an include filter this morning in my domain (`test.org`). I have: _Custom Filter > Include > Request URI >_ `^/test-a/46212$|^/test-a/46212|^/test-a/46315` Now after I go to _Content > Site Content > All Pages_, I see stats for other pages that I didn't include in my filter. For example I see `/somethingelse`. I only want to see stats for `/test-a/46212` and whatever else is in my filter. Please let me know what I'm doing wrong."} {"_id": "12053", "title": "Using Freelance Services", "text": "Can anyone share any advice or experiences using any of the many freelance gun-for-hire sites out there like guru.com?
I may be leading a small development team soon to run a project that will from time to time need outside help, and I'd rather farm out some components that can be well defined up front in the event that the team and I get overwhelmed or behind. I'm thinking everything from local developers on craigslist to some outsourced services, but I haven't had much experience with either. Thanks"} {"_id": "53309", "title": "How to find out which sub-domain has been indexed more in Google?", "text": "I have some sub-domains. I use \"site:mydomain.com\" in Google search input to find out the number of pages Google has indexed from my domain overall. How can I find out which sub-domain has been indexed more? Can I do it for other domains?"} {"_id": "51423", "title": "Do the \"Contact us\" and \"Privacy policy\" pages affect SEO?", "text": "Just like the title says, what are the effects of having a \"Contact us\" and a \"Privacy policy\" page on your site? I've read that it could build up your trust with Google; is this true? I've also read that some people say you should add a `noindex` tag to your \"Privacy policy\" page; would this be a good idea? I say this because many websites have similar privacy policies, and I don't want any duplicate content issues. (For example, many people could be using the same WordPress privacy policy generator.) I'm wondering the same things for the \"Contact us\" page as well."} {"_id": "42784", "title": "How to create sharing icons with the share count number showing?", "text": "See a sample article on theonion.com ![enter image description here](http://i.stack.imgur.com/GVCoW.png) I know how to create icons for these sites without using their standard widget, but how did they get the number of shares below each icon? Is there an API to ask for it?"} {"_id": "55877", "title": "Original domain (which is 301 redirected) meta description doesn't update", "text": "I have a domain name `example.com` which I 301 redirected to `www.example.com`. 
Problem is when I search for the website on Google, it shows the `.com` rather than the `www`. So the `.com` meta description shows, and since there is nothing on it, it has GoDaddy stuff written all over it. I've re-crawled both sites using Google Webmaster Tools but still no luck. Anyone have any idea on how this can be updated? Or will I just have to be patient?"} {"_id": "56783", "title": "Will using same colour H1 as background hurt SEO if I use background image?", "text": "Can it hurt SEO if H1 headings are the same colour as the standard body background colour, even if all H1 headings are placed in divs that have a contrasting background image? (So to be clear: the heading will be clearly visible) To illustrate: CSS: body { color:#4e4e4e; background-color: #ffffff; } #section1 { background: url('example.jpg'); } h1 { font-weight: bold; color:#ffffff; } (example.jpg in this case would be an image with very dark colours) HTML:

    <div id=\"section1\"><h1>Lorem ipsum dolor</h1></div>

While at first glance it may seem that H1 headings are hidden, in reality they are not, because the white letters will contrast with the dark background image. Will Google (wrongly) see this as hiding text, as a result hurting SEO? And if so, would it then be better to define H1 colours per div like in the CSS example underneath? body { color:#4e4e4e; background-color: #ffffff; } #section1 { background: url('example.jpg'); color: #fff; } I ask this because, while the second example seems to make more sense if you want to make it clear you're not hiding text, I also have the feeling that this defeats the purpose of a separate CSS file. If I'm going to have to change the font colour for each div I use an H1 in, I might as well switch to inlining."} {"_id": "61694", "title": "Why do we have to wait 60 days between each domain transfer?", "text": "I understand that ICANN makes the rules requiring this, but I don't understand why we have to wait. Each time we transfer a domain, we have to pay to do so. That partially stops fraud. I also know that it can take from a few hours to a month for the transfer process to be completed on both ends. **Why do we have to wait 60 days between each domain transfer?** Also, allow me to quote the ICANN FAQ > **If I bought a name through one registrar, am I allowed to switch to a > different registrar?** > > Yes. The Inter-Registrar Transfer Policy, applicable to all ICANN-accredited > registrars, provides that registered name holders must be able to transfer > their domain name registrations between registrars. You must wait 60 days > after the initial registration or any previous transfers to initiate a > transfer. It states that it is a _requirement_ without explaining the reason why."} {"_id": "56787", "title": "How do I track automatic translation of web page?", "text": "My web site is posted in English. I do not have the Google Translate plugin installed, nor do I have any plans to install it. 
However, I'm inferring from some of my analytics data that people visiting my web site are using Google Translate to translate my pages. I presume they're visiting my site and seeing Google's \"This page is in English. Would you like to translate it to [their language]?\" and clicking \"Translate\". Is there any hook in Google's automatic translation, e.g. some event fired, that I can use to detect these automatic translations and fire a Google Analytics event tracking the translation and hopefully capturing the language they're translating to? Note: I've seen this post, but the answer refers to the plugin, which I'm not using. I want to track when Google volunteers to translate automatically."} {"_id": "56786", "title": "Changing subdomain url to new root url issue (404)", "text": "I have a blog on a subdomain `technotes.tostaky.biz` and I want to move it to a new URL: `www.mytechnotes.biz`. Of course, I want to redirect `technotes.tostaky.biz` to `www.mytechnotes.biz`. `www.mytechnotes.biz` has been set up properly, but I am trying to figure out what I should do for the redirect. Right now, I have two A records (`www.technotes` and `technotes`) pointing to the IP address of the root `tostaky.biz` URL server. I also have a permanent redirect (both www and non-www) from `technotes.tostaky.biz` to `http://www.mytechnotes.biz/`. When I open `technotes.tostaky.biz` in Chrome, I get a 404 \"The requested URL / was not found on this server. That's all we know.\" I don't know what is causing this issue. Should I wait for propagation or do I need to modify my configuration? Should I use CNAMEs instead of A records? **P.S:** I forgot to mention that `www.technotes.tostaky.biz` redirects properly."} {"_id": "28422", "title": "Site failing randomly - could it be Cloudflare or something weird in the JS?", "text": "I've been working on a simple site that uses javascript to fade through some fullscreen background images as well as some other simple animations. 
I've tested the site on Chrome, Safari, FF and Opera on OSX, IE8+ on Win7 and Chrome & FF on Ubuntu, and everything looks as I'd expect it to. However, I've had reports of the site failing to load (it stops at the stage where the background fades up) on Safari and Chrome on OSX and Win. I can't replicate this on any setup, so I'm finding it impossible to troubleshoot. Google's instant preview shows the site fine, as do most of the options at browsershots.org, so I'm really scratching my head. I'm running the site's traffic through Cloudflare and I'm wondering whether anyone can see (or knows from other sites) why Cloudflare might be mangling the JS or causing a problem somehow (I don't get any errors in the JS error console). Of course, if you can replicate the problem on your machine and can suggest an area to look at, that would be amazing, but I'm hoping that, like me, you don't see any problem with the site! Here's the site: http://www.bighornrevelstoke.com Thanks, James UPDATE: I am very grateful to those of you who have taken the time to make suggestions below, but because of my poor _'n00b'_ reputation I'm unable to upvote any of the suggestions to show it. Sorry about that."} {"_id": "26800", "title": "How can I prove to google that a godaddy based site with instapage is my site?", "text": "I have registered a domain with GoDaddy and do not have their hosting services, just the free InstaPage. I want my domain to be actually hosted at Google Apps. I have got to the point in the Google setup where Google has asked me to verify that this is my domain by adding some metadata in the header of my page (there are a couple of other options as well). The problem is that because I am not using GoDaddy's hosting I cannot add anything to the head, as InstaPage only gives me edit access to the body and some templates. 
Is there a way I can verify that this is my domain without having to pay for GoDaddy's hosting?"} {"_id": "25714", "title": "Which Filing System Is Better?", "text": "I realise that this is a bit stupid, but I'm a bit lost here - I've recently decided to re-arrange my webserver's content filing system so that it is more efficient and flexible in regards to the addition of content in the future. Could someone advise on which way will be more beneficial in the long term? 1.) My current, original filing system: I currently have a folder for each web page: index, contact, about, etc. Inside each webpage folder I have a folder for each type of content that is on that webpage: css, js, images, flash, pages, etc. Examples: `http://example.com/contact/images/email.png` or `http://example.com/about/flash/copyToClipboard.swf` 2.) My proposed, planned filing system: I plan to have a folder for each type of content. Inside each content folder I plan to have a folder for each webpage. Examples: `http://example.com/images/index/welcome.png` or `http://example.com/js/index/animate.js` General summary: is it better to keep each page's content isolated (filing system #1) or to keep all kinds of content together (filing system #2)? Thanks :)"} {"_id": "25715", "title": "Will I preserve link equity if I use two redirects?", "text": "I'm switching a social media site over to a Drupal platform. I'm planning to implement permanent redirects (301) from each old url to the new url. I'm using page titles in the url string on the new site: http://www.newsite.com/this-is-my-page-title From a technical standpoint it would be easier for me to redirect to the page id: http://www.newsite.com/node/123 which automatically redirects (with a 301) to the above title url. 
Is there a problem using two redirects or is it better to just use one?"} {"_id": "43785", "title": "Accessing website from within the LAN via web address", "text": "I run a small website at my house; I just started it up the other day running Wordpress. The problem I'm facing is fairly obvious: I need to access my website, but I can't use the URL due to my hosting it on the same network I'm trying to access it from. I use Chrome, and when I try to load the page it says that it simply could not connect. My question is simple: how can I access my site, Wordpress running through Apache on an Ubuntu 12.10 LAMP stack, on the same network that I'm hosting it from, without the use of a proxy or VPN?"} {"_id": "58409", "title": "SocialEngine 4.5 redirect loop problem", "text": "I am installing SocialEngine 4.5. Here is the .htaccess file in the first level of the SocialEngine `/public_html/social_engine2/` directory: # $Id: .htaccess 7539 2010-10-04 04:41:38Z john $ # For security reasons, Option followsymlinks cannot be overridden. #Options +FollowSymLinks Options +SymLinksIfOwnerMatch RewriteEngine On RewriteBase / # Get rid of index.php RewriteCond %{REQUEST_URI} /index\\.php RewriteRule (.*) index.php?rewrite=2 [L,QSA] # Rewrite all directory-looking urls RewriteCond %{REQUEST_URI} /$ RewriteRule (.*) index.php?rewrite=1 [L,QSA] # Try to route missing files RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} public\\/ [OR] RewriteCond %{REQUEST_FILENAME} \\.(jpg|gif|png|ico|flv|htm|html|php|css|js)$ RewriteRule . 
- [L] # If the file doesn't exist, rewrite to index RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php?rewrite=1 [L,QSA] # sends requests /index.php/path/to/module/ to \"index.php\" # AcceptPathInfo On # @todo This may not be effective in some cases FileETag Size This file is in the `/public_html/social_engine2/install/` directory: # $Id: .htaccess 7554 2010-10-05 03:40:58Z john $ # For security reasons, Option followsymlinks cannot be overridden. # Options +FollowSymLinks Options +SymLinksIfOwnerMatch RewriteEngine On RewriteBase /install/ # Get rid of index.php RewriteCond %{REQUEST_URI} /index\\.php RewriteRule (.*) index.php?rewrite=2 [L,QSA] # Rewrite all directory-looking urls RewriteCond %{REQUEST_URI} /$ RewriteRule (.*) index.php?rewrite=1 [L,QSA] # Special cases RewriteCond %{REQUEST_URI} static # RewriteRule (.*) Boostrap.php [L,QSA] RewriteRule (.*) index.php?rewrite=1 [L,QSA] # Try to route missing files RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} public\\/ [OR] RewriteCond %{REQUEST_FILENAME} \\.(jpg|gif|png|ico|flv|htm|html|php|css|js)$ RewriteRule . - [L] # If the file doesn't exist, rewrite to index RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php?rewrite=1 [L,QSA] php_value post_max_size 100M php_value upload_max_filesize 100M php_value max_input_time 600 php_value max_execution_time 600 php_value memory_limit 128M # @todo This may not be effective in some cases FileETag Size Please let me know if there is something wrong with the code."} {"_id": "43834", "title": "PHP not enabled by default on GoDaddy shared linux web hosting", "text": "I recently made the mistake of starting a web hosting account with GoDaddy. Let me start by saying that any comments or answers suggesting that I should walk away from the money I spent with GoDaddy and go with someone else will be flagged. 
I have heard this countless times and do not wish to hear it again. Perhaps I am naive, but I thought that PHP was enabled by default on shared web hosting. Certainly PHP should be enabled; my account started with two webform mailer scripts, webformmailer.php and gdform.php. There also exists a php.ini file that cannot be modified or deleted. It seems to prevent certain things like registering globals and other security concerns in a shared hosting environment. PHP is not enabled though. A simple test script just spits its source back as text, telling me the PHP handler doesn't exist or is just not installed. There was no .htaccess file so I decided to add one myself. It made no difference. > AddHandler application/x-httpd-php .php > > Options +ExecCGI +FollowSymLinks The php.ini file that they provided in my account is as follows: register_globals = off allow_url_fopen = off expose_php = On max_input_time = 60 variables_order = \"EGPCS\" extension_dir = ./ upload_tmp_dir = /tmp precision = 12 SMTP = relay-hosting.secureserver.net url_rewriter.tags = \"a=href,area=href,frame=src,input=src,form=,fieldset=\" ; Only uncomment zend optimizer lines if your application requires Zend Optimizer support ;[Zend] ;zend_optimizer.optimization_level=15 ;zend_extension_manager.optimizer=/usr/local/Zend/lib/Optimizer-3.3.3 ;zend_extension_manager.optimizer_ts=/usr/local/Zend/lib/Optimizer_TS-3.3.3 ;zend_extension=/usr/local/Zend/lib/Optimizer-3.3.3/ZendExtensionManager.so ;zend_extension_ts=/usr/local/Zend/lib/Optimizer_TS-3.3.3/ZendExtensionManager_TS.so ; -- Be very careful to not to disable a function which might be needed! ; -- Uncomment the following lines to increase the security of your PHP site. 
;disable_functions = \"highlight_file,ini_alter,ini_restore,openlog,passthru, ; phpinfo, exec, system, dl, fsockopen, set_time_limit, ; popen, proc_open, proc_nice,shell_exec,show_source,symlink\" After contacting customer support, they insist that there is nothing wrong with my account. I must be doing something very stupid here because I can't find any instance on the web where somebody experienced a similar problem with GoDaddy."} {"_id": "39162", "title": "Support same url in different forms", "text": "I have seen many websites supporting multiple forms of the same URL. For example, www.example.com/question1/ OR www.example.com/Question1/ OR www.example.com/QUESTION1/ etc. all lead to one page, with say a 301 redirect to www.example.com/question1/. Does this affect the page rank anyhow, or is it just for a seamless user experience, or is there some other reason behind this? In fact, even stackoverflow/stackexchange does this. No matter what the text after the question id in the url is, they redirect you to the correct question!"} {"_id": "43838", "title": "Plesk 11 preview site with hosts file", "text": "I want to preview my website by manipulating the hosts file, linking to the server's URL, but it's not working - my browser always tells me \"the connection has been reset during page load\". What am I doing wrong? I edited my hosts file like many tutorials out there suggest, and I checked (with Wireshark) that the browser connects to the correct IP (that of my server). Yet, I can't see anything of my website...?"} {"_id": "4941", "title": "How do I build backlinks?", "text": "I've recently made a site and I want to know how to go about making backlinks?"} {"_id": "55979", "title": "What are new backlinking techniques", "text": "I am in search of new backlink methods. I know most of the traditional methods of acquiring backlinks, but they are no longer in play. 
If there is something out of the box or not so common, please share your knowledge and wisdom."} {"_id": "64625", "title": "How to increase backlinks of blog or websites", "text": "I know that this question is very easy and also silly, but I don't know how to make **backlinks to my blog**. I have tried **commenting on various blogs and websites**, but in Alexa there is **only 1 backlink**, which is **my own blog**. Does anyone know how to make **quality backlinks for a blog or website**? I also want to know whether, by increasing **backlinks**, the **SEO** of my blog improves. _Thanks in advance..._"} {"_id": "25718", "title": "Use Addon domain within login name for FTP account in cPanel", "text": "Let's say I have a cPanel (version 11.30.6 (build 3), cPanel Pro 1.0 (RC1)) shared hosting account for `myDomain.com`. I also have several other \"Addon\" domain names on this account, each with their own document root. Under the cPanel FTP Accounts page, it only allows me to do this... **Server:** `ftp.myDomain.com` **Login:** `name@myDomain.com` However, I want to set up an FTP location at one of my \"Addon\" domains so that it looks like this... **Server:** `ftp.myAddonDomain.net` **Login:** `name@myAddonDomain.net` I'm not sure if it's even possible, but I'm hoping it is and that somebody here would be able to explain how. In other words, cPanel allows me to create an email account at literally **any** of my Addon domains or sub-domains. I'm wondering if I can create FTP accounts in the same fashion... but it looks like I cannot."} {"_id": "64695", "title": "backlinks: two domains with same IP", "text": "I run several different web pages on different servers (with different IP addresses). These pages are linking to each other in order to boost the number of backlinks pointing to my pages. I would like to move all those projects to a single virtual host (with a single IP address). My question is: how does Google handle links between different domain names on the same IP address? 
Is there some penalization for it? Could this lead to lower PageRank?"} {"_id": "21875", "title": "Do premium domain names help us with other languages too?", "text": "It's commonly known that premium domains with one or two relevant keywords may help us improve our rankings in SERPs. But would it be possible that an English premium domain, for example gold.com (no, it's not mine), also helps to drive more non-English traffic (I'm talking about non-English pages)? ## Trying to make my question clear: Let's suppose that I have an English premium domain with a page like this: gold dot com/post/123/gold-is-yellow And decide to have a Spanish, Portuguese or French version of the site with pages like: gold dot com/es/post/123/el-oro-es-amarillo gold dot com/pt/post/123/o-ouro-e-amarelo gold dot com/fr/post/123/fsdfsdfsdf Given that my English domain is a premium one and highly relevant for English terms, will it also help me to achieve good rankings for non-English search terms like: oro (Spanish) or ouro (Portuguese)?"} {"_id": "49523", "title": "Is my PageRank/linkjuice affected after 301 redirect and losing social likes?", "text": "I had to restructure my URLs and had to introduce some 301 redirects. However, the Facebook likes and Google +1's of the old pages are then also lost. Besides it being a shame that those numbers don't show anymore, is my PageRank/linkjuice affected by the loss in likes/+1's? P.S.: I know a 301 redirect by itself results in a slight linkjuice loss, but that's not what I'm asking."} {"_id": "21652", "title": "What's the best way to download files from a live blog?", "text": "I'm having trouble figuring out a way to download files from a live blog. To be more specific, I'm trying to download the directory and php files that this blog consists of, and work on those files in my local environment while still keeping the original blog live on the net for the time being. 
I tried to use FTP (FileZilla) to download the files remotely, but it failed to open the files for writing and the file transfer failed. I then attempted to go into the GoDaddy control panel and change the file permissions but, for some reason, it won't let me do that. Is there a better way to do this?"} {"_id": "21655", "title": "remove 'noindex' meta tag, under Tumblr", "text": "Is it possible to remove the 'noindex' meta tag for search engines in a Tumblr blog? I have tried all options under the dashboard. I think Tumblr adds this option automatically when the HTML page is generated. If I write another meta tag before it, in a custom HTML style, will it be used by GoogleBot or others? > < meta name=\"robots\" content=\"noindex\"/ > < meta http-equiv=\"x-dns-prefetch-control\" content=\"off\"/ > < /head > > < body > ... Also, these scripts are in the template: >"} {"_id": "21870", "title": "pingback / trackback support for a photo sharing website?", "text": "Is there a photo sharing service, such as Flickr or Picasa, that will collect the urls of the locations where the photo has been posted on other blogs (or mentioned in tweets, etc.)? This could be accomplished by posting each photo as a blog entry using **wordpress**, which would then automatically handle pingbacks, but of course a blog doesn't perform quite like a proper photo service. Perhaps this could be done with a private photo hosting server like **zenphoto** by editing the php, but that seems rather involved. Does such a service already exist?"} {"_id": "21873", "title": "YouTube fullscreen not displaying", "text": "For some reason YouTube videos on my website do not get the fullscreen button, even though I added the parameter `allowFullScreen` set to true in both the object and embed tags. Here's an example page: http://www.indievault.it/2011/11/09/indie-vault-alla-games-week-2011-online-la-video-gallery/ Just take a quick look at the source. The `allowFullScreen` param is there, but the button won't show. 
Here's an excerpt from the code in that page: "} {"_id": "21658", "title": "What could keep Chrome from downloading files?", "text": "I run a subscription-based site for poker training videos. In the recent past, some portion of our subscribers have developed problems downloading our videos. The affected users are exclusively seeing problems while using Chrome as their web browser, however there are other users who use Chrome with no problem. Here is the exact behavior they are seeing: * Pasting the url of a video file directly to their browser produces about:blank and the file does not download * Clicking the link normally produces no result (e.g. it behaves like a dead link) * Right-clicking the link and opening in a new tab/window produces about:blank and the file does not download * Right-clicking the link and choosing Save-As produces a 5 to 10 second delay, after which they get a Save As dialog. After they choose a location, the download proceeds at full speed. Again, this problem has only presented itself for a subset of users browsing with Chrome. All affected users have the latest version of Chrome, but so do some of the unaffected users. All affected users have Windows Vista or Windows XP. As far as we can tell (small sample size), none of the affected users have Windows 7 and none of our users on Windows 7 have been affected. What could be causing this problem? How could it be solved? Edit: Here is a copy of the headers from one of these downloads. HTTP/1.1 200 OK Date: Sun, 06 Nov 2011 18:35:56 GMT Server: Apache Last-Modified: Mon, 24 Oct 2011 11:57:29 GMT ETag: \"180cc292-5229699-4b00a208b3c40\" Accept-Ranges: bytes Content-Length: 86152857 Keep-Alive: timeout=10, max=29 Connection: Keep-Alive Content-Type: application/octet-stream Edit: Some updates. Chrome extensions do not seem to affect this. One of the affected systems was upgraded from Windows XP to Windows 7, with no other changes made. That system now downloads properly. 
This issue doesn't only affect the WMV videos. It happens with FLV and M4V as well. A sample file is at http://www.grinderschool.com/videos/zbn9Y7TbeWcUbCPFNnLd/Carroters001.wmv"} {"_id": "21879", "title": "Number of keyword phrases", "text": "Does the number of keyword combinations used for SEO of a given website matter? Is it better to have five or thirty-five?"} {"_id": "52780", "title": "Do dynamic URLs have their own PageRank?", "text": "Do URLs with parameters have their own PageRank for the given parameter values, or is the PageRank aggregated at the basic page URL level, ignoring the parameters?"} {"_id": "65549", "title": "Microdata reuse within another item", "text": "I've been reading around and I feel I've hit a wall. My intention is to mark up my site with microdata in a way that I won't need to re-declare items that already exist in the page. For example, my header will always contain info regarding the \"Organization\" that the website represents. Within the \"Products\" that I offer I can specify the \"Brand\" of that product. So the product's property \"brand\" will replicate the \"Organization\" info mentioned in the header. Now I don't think I'm supposed to repeat the organization's markup within the brand property of the product item, since I feel there should be a way to reference that organization item directly. I just haven't found a way to do this; are there any ideas? I've checked this answer and although useful it doesn't address my issue. I tried messing with it however to give you an idea of my intentions: This doesn't work since it adds the \"id\" property to both \"Organization\" and \"Product\". **##### EDIT1: #####** zigojacko's answer didn't really go towards what I was expecting, since he uses a single container with all the information needed to present all the markup for his product. 
However, in my case the site layout isn't broken down into a container that would hold all the necessary information, as my company info is within the header and the products/offers are near the footer (as the main form to define those offers is above the fold), and other info is scattered between each of those containers. As such, I would eventually like to link the company (that resides in the header) to the brand of each product (that resides near the footer); I just wouldn't like to repeat all the code necessary to describe the company and therefore would prefer linking to some sort of item identifier. There is also the possibility of nesting every bit of content within the page to the company, but as a comment to this question I mention that specific question in the webmasters-stack-exchange. So I know how to pull that type of solution off; I'm just wondering if there's the possibility of referring to an existing item elsewhere on the page. **##### EDIT2: #####** In the comments of this question I might have found a better way to explain what I mean. > By pointers I mean an html attribute that could reference an Item. In the > code example I gave, I have a loose (not nested) Organization that I would > like to be inserted into my second item Product within its brand property, > but without having to copy+paste or echo again within the Product. I tried > to simulate that behavior with the itemref attribute."} {"_id": "34560", "title": "MODx give user group access to the content of a resource container but not the parent resource", "text": "I have a Category \"Articles\" and in there is the Category \"Travel\". I want to allow a user group to add and edit articles as children of Travel, but they should not be able to edit the Travel page itself. How can I do that?"} {"_id": "65540", "title": "Google webmaster reconsideration requests without manual actions", "text": "My site's ranking in Google results decreased suddenly about ten days ago. 
I read many articles about it at support.google.com, but I didn't find the problem. Also, there is no manual action in Webmaster Tools. Is there any way to ask Google to reconsider my site's ranking, or to tell me what the problem with the site is? ![enter image description here](http://i.stack.imgur.com/Ypg6e.png) Background for my site: it is a pharmacy shopping website. It has ~700 product pages, and almost all of the product descriptions are copied from the producers' websites."} {"_id": "8429", "title": "Will AJAX-fetched material be invisible to search engines?", "text": "With the new method of doing things that I've put together, my page content is not actually on index.html or any of the other pages. Instead, it is in index.txt and merely fetched by AJAX. Is my content, which is in index.txt, going to be invisible to the search engine spiders? Or will they load the AJAX-fetched material and then analyze the whole thing once the material is fetched?"} {"_id": "20155", "title": "Bots and scripts", "text": "> **Possible Duplicate:** > Will AJAX-fetched material be invisible to search engines? Can bots understand what's going on in javascript or vbscript? Most of the social networking modules and ad codes consist of just a js file which appends an iframe (google+ button, facebook like button, facebook likebox, google ads etc). Do bots understand that there are iframes involved in the site? Do bots run javascript and check the result? Or at least try to understand what's going on in that js?"} {"_id": "65544", "title": "Does having a single image referenced from multiple pages affect SEO?", "text": "I would like to have a collection of images that are rendered in the browser. Say, for example, I had **Images:** [img1 ; img2 ; img3 ; img4 ; img5;] **Category:** [cat1 ; cat2 ; cat3] **Problem:** Assume image 1 belonged in both cat1 and cat2. Would it be better for SEO purposes to upload image1 with different alt tags, OR should I upload the same image twice using different URLs? 
**Note**: _I have a folder pattern such as_ * `/dir/cat1/` * `/dir/cat2/` * `/dir/cat3/` The image goes in the folder of the category to which it belongs. So if the image is best not duplicated, I will use the first choice."} {"_id": "2402", "title": "Why isn't my website in Google search results?", "text": "I made a website one month ago and it still isn't indexed by Google. I haven't changed any settings in the _robots.txt_ file. I've tried to search for `site:websitename.com` but I still can't see it, and I have a Google Analytics account. Why isn't my site in Google results?"} {"_id": "45264", "title": "Google not reading my website content, title, meta tags and description?", "text": "I have a new website, and Google is not reading or indexing my home page content or meta tags. Can anyone help me? My website: http://www.packerandmoversindelhi.in"} {"_id": "34231", "title": "How long does it take for Google Webmasters to index site after submitting sitemap?", "text": "> **Possible Duplicate:** > Why isn't my website in Google search results? I submitted my website to Google search today using a sitemap in Google Webmaster Tools. The status on the sitemap says OK and it shows that 12 urls have been recognized. I was wondering how long it takes for the site to get indexed, as the indexed url option says > \"No data available. Please check back soon.\" I am not sure if it is showing this message due to some error, or if everything is fine."} {"_id": "68947", "title": "Google pulls keywords from only two pages", "text": "When I sign into Google Webmaster and view Google Index > Content Keywords, it only shows keywords off of a single page and from my site map. It's a small site with five pages, but it's only getting keywords from the single page. Also, when I go under Google Index > Index Status it shows only two pages have been indexed, and it's stayed that way for about a month. 
I assume that this is the sitemap and the page where all of the keywords are being pulled from. My robots.txt file allows all pages to be crawled, and if I search \"site:mysite.com\" it shows all of the pages. I can't seem to find a way to see which pages have been indexed in Webmaster Tools. The site has been up for a couple of months. I have submitted a sitemap which has all of the pages. Why isn't Google gathering keywords from the other pages? How do I get the other pages indexed?"} {"_id": "18651", "title": "Pages are not searchable by Google", "text": "> **Possible Duplicate:** > I cannot see my website in google After migrating my website (based on the Drupal CMS) to web hosting (it has been there for about two months), I noticed that my pages aren't searchable by Google. It finds nothing, even though it should clearly find my website. The same thing happens when I want to know the ranking of my website. Do you know where the fault could be?"} {"_id": "54433", "title": "A relatively new blog seems to be getting very poor Google indexing", "text": "I have a new blog that is 2 months old. In the first few weeks, it was getting indexed nicely and my Google Webmaster reports were showing that it was getting crawled and began ranking for some terms. Then as I kept writing, the Google Webmaster report thinned out and showed fewer and fewer terms that this blog ranks for. Now there are only 4 terms, with one of them being my name. Is there something I need to do to keep the old posts indexed and crawled? Thanks, Alex"} {"_id": "22613", "title": "How to make google get to know my domain name", "text": "> **Possible Duplicate:** > I cannot see my website in google I have a strange problem with my website. I have a website, let's say Abcdefg.com. The website has been live for 2 months and Google still doesn't know it. While searching for my domain name 'abcdefg', Google displays results for a similar phrase (abcdef) but not for mine. How do I make Google get to know my domain name? 
Website and sitemaps have been submitted via Google Webmaster Tools."} {"_id": "68551", "title": "How to improve image indexing", "text": "I have a curious problem: Google Webmaster Tools shows that 330 images were submitted from my sitemap and only 6 are indexed. I try to use alt tags, descriptive names, everything by the book, but still no results. What can you suggest to improve image indexing by Google or any other search engine?"} {"_id": "1574", "title": "How long did it take for your new website to be indexed by search engines", "text": "> **Possible Duplicate:** > Why isn't my website in Google search results? I have added my website to the webmaster tools of some major search engines already, and supplied the sitemap for the website. I have been waiting for exactly 3 weeks now and there is still no data in the webmaster tools. I have also done some link building, and the click-through link to my website could be found in the search results from day one. How long did it take for your website to get indexed? And why does it take so long?"} {"_id": "56544", "title": "Google not showing the sitemap after submitting it", "text": "I am trying to submit the sitemap to Google. I have already submitted the sitemap XML path to Google 2-3 times, but when I search for my website URL on Google, it does not show the menus or any indexed pages. What should I do? Is anything wrong?"} {"_id": "47312", "title": "Multilingual subdomain on different hosting server: will this be indexed by search engines?", "text": "I have a Japanese language sub-domain: ja.example.com, which is hosted by server A. The root domain (example.com) is hosted by server B. They have completely different IP addresses: (e.g.) 205.56.78.15 vs. 56.45.03.12. Will the sub-domain be indexed by search engines? The sub-domain has been live for one month and there have been no visits from organic searches. 
Thanks"} {"_id": "14254", "title": "My blog is not showing in Google Search results", "text": "> **Possible Duplicate:** > I cannot see my website in google Hello everyone, yesterday I created two blog entries and published them on Blogger, but they are not showing up in Google search results. How long do I have to wait before I can see them in Google search results (before Google crawls the blog)? I have already shared the address on a couple of sites and there are almost 80 page views from three different referrals. Any help/suggestion is highly appreciated. Regards"} {"_id": "56066", "title": "How do services that make subsites (i.e. new subdomains) get those sites indexed by Google and other search engines?", "text": "Services like Tumblr, GitHub Pages, Quora blogs, etc. make new subdomains when a user signs up or makes a site. So the subdomains are like: blogname.tumblr.com , pagename.github.io, blogname.quora.com, etc. How do Google and other search engines find out about these new sites? Is it as simple as having a sitemap.xml file being \"resubmitted\" to Google? See https://support.google.com/webmasters/answer/183669?hl=en. I know that you can resubmit sitemaps to Google via an HTTP request, but what about new sites without a previously submitted sitemap? Or do the owners of the subdomain (the user who set up the new site) have to submit their site manually via https://www.google.com/webmasters/tools/submit-url ? Just getting started learning about SEO and how Google crawls the web. Couldn't find anything that answers this (or I'm just bad at Google), hoping someone can answer. Thanks! Edit: I read through this: Why isn't my website in Google search results? 
But the question still holds for sites where subdomains are generated dynamically."} {"_id": "45239", "title": "Website Not Showing for Keywords in Google Search", "text": "I have a website, `www.rushinformation.com`, that is 4 months old, and I update it almost every day. But the problem is that when I type my keyword 'Rushinformation' alone into Google, it is not shown anywhere in the search results. However, when I type `rushinformation.com` or `www.rushinformation.com`, it displays my site. I have submitted the website to Webmaster Tools and Analytics with my sitemap, so please tell me how to fix this problem."} {"_id": "22336", "title": "Site not updating on Google", "text": "> **Possible Duplicate:** > I cannot see my website in google I uploaded my site a week ago. Before, I only had \"Very Soon\" on the server. Google still shows \"Very Soon\" even though the content has been new for one week now! I pinged and resubmitted my site, but it is still not showing up!"} {"_id": "56613", "title": "Site does not appear in Google organic search", "text": "My site is showing up in the organic search of Yahoo and Bing, but there is no trace of it on Google. I am using WordPress and a plugin called Google XML Sitemap. I also signed the site up in Webmaster Tools and Google Analytics. It still does not appear, even when typing the name of the site itself. What can be happening? Is it possible that a competitor's website is blocking my website on Google?"} {"_id": "47824", "title": "Website not showing up on Google after removal of blocking robots metatags", "text": "A friend asked me to take a look at their website, which had been online for several months, but did not show up on Google. I noticed that they had a subdomain which did show up, so the query site:hisdomain.com showed some results, but all of them were from the subdomain. 
The query site:hisdomain.com -SubdomainTitle resulted in zero results, since it blocked all results from the subdomain, and there were no results from their main domain. The following query yielded zero results, too: site:www.hisdomain.com Then I noticed that they had some SEO Wordpress plugin active which was configured to generate `noindex, nofollow` meta tags on their pages. I removed this and the meta tag was gone. I logged in to **Google Webmasters** , submitted some **Fetch as Google** queries and told Google to **Submit to Index** them (on both `domain.com` as well as `www.domain.com`). I told my friend his site would appear on Google the next few days or weeks. **This was several weeks ago, and his site still doesn't appear.** Then I checked their **Index Status** on both `domain.com` and `www.domain.com`. Here's how it looks for `www.domain.com`: ![enter image description here](http://i.stack.imgur.com/zWs8h.png) ![enter image description here](http://i.stack.imgur.com/EXEjM.png) It's strange that the `Blocked by robots` count has always been `0`. If the `robots` metatag was the reason for Google not indexing their site, there would have been some `Blocked by robots` results, I suppose. **So, there must be another reason why Google doesn't index their site. Any ideas? How can I analyze the situation further?** **EDIT:** Here's the URL: http://www.lab25.ch. The subdomain is not directly related to the main URL: http://addon-katakunst.lab25.ch. And no, **it's not duplicate content**."} {"_id": "45545", "title": "Changing GoDaddy Webmaster tools options", "text": "A client of mine (for school) has a website hosted on GoDaddy which we wants my team to help him increase his marketability on Google, because currently, when his company name is Google searched no relevant results are displayed. So we were going to setup Google Webmaster Tools for him, so we can help him get there. 
Now we logged into his GoDaddy dashboard, and it turned out he already had one set up through GoDaddy, where they assigned him a Gmail account. I cannot figure out why his site won't show up at all on Google. Does anyone know why it isn't, and what kind of settings I can change to help him out? Also, we were trying to change the Gmail account his Webmaster Tools is associated with; is this possible? Ultimately, I'm trying to get his site seen on Google through Webmaster Tools, but I cannot figure out why it hasn't worked yet. Any help would be appreciated, thanks."} {"_id": "50957", "title": "My website pages not getting indexed by Google", "text": "The website `www.itztechgadget.com` isn't getting indexed at all. When I check it in Webmaster Tools, it shows 0 pages indexed. I don't understand why this problem is happening. Googlebot fetches and submits it to be indexed, but none of the website's pages are indexed."} {"_id": "38773", "title": "Google not listing my web site", "text": "I have a site which is more than 2 months old and it's not showing up when you directly type the URL into Google. Here is the link of the site: www.icarda.org However, the site is listed in Bing and Yahoo without any problem. Please help me to solve this issue."} {"_id": "63019", "title": "Some website pages not indexed by Google although followed the recommended steps", "text": "I have built a website (www.shareall.nl) in WordPress and did some SEO (meta descriptions, headings, links, alt text, microdata, sitemap, robots.txt, etc.) using the Yoast WordPress SEO plugin. I did not squeeze everything out of it, but the site ranks pretty well in Google on important keywords regarding Prezi (top 5). The problem, however, is that there is a set of pages that are not being indexed, and these pages are relevant to my customer. 
One of the pages is http://www.shareall.nl/prezi-trainingen/prezi-vervolgtraining (I cannot post more links due to my reputation on stackoverflow, but they are all about training). I know (as the moderator told me) that there is another question about this topic, but I have followed all recommendations given there without success. I would like to get more ideas of what to try and hope someone can help me with this. This is what I have done so far, along with some facts about the site which might be relevant: * submitted the sitemap and saw most of the pages being indexed * robots.txt is not preventing pages from being crawled * manually crawled pages using Google's webmaster tools and submitted them afterwards * the website itself has multiple links from other pages towards the training pages * don't have noindex set for the concerned pages * domain is not new and has existed for numerous years now (since 2011) * the website **is** new and was launched this year (February 2014) * I have multiple links within the site pointing to the missing pages Again, any help is greatly appreciated!"} {"_id": "17266", "title": "Major Indexing problem with my site", "text": "> **Possible Duplicate:** > I cannot see my website in google Hey guys, I am having a problem with indexing my site http://www.loosediamondschicago.com, which is a WordPress site hosted on GoDaddy. It's been about two weeks, but still no search engine is indexing me. **Some Points You Should Know:** 1. The site is hosted on WordPress. 2. When the site was in development I turned off the search engine visibility from the WordPress dashboard. After turning it on I checked the meta tags and I found no noindex. So, I guess that is not the issue. 3. From my previous experience with WordPress I know WordPress is smart enough to automatically remove the **www** in the URL, but in this case it's not. 4. I have sent the sitemap via Google Webmaster Tools about 4 days ago, but there is still no indexing. 
However, Webmaster Tools is showing an error for using a non-**www** URL in the sitemap while my URLs are **www**. But I believe that even if I don't have a sitemap my site should still be crawled, right? 5. The domain my client bought was used before by other people, but it expired a couple of years ago and then my client registered it again. 6. I have checked whether the domain is blacklisted or not. I have found it is not blacklisted. 7. I have gotten a couple of backlinks hoping that my site would get indexed, but still no hope. I have not faced this kind of problem before, so people, please advise. I am now checking GoDaddy to find out if there are any crawl errors. I will update the post when I have it. **Update** The site has been up for one and a half months, but search engines have been allowed for about 2-3 weeks. I blocked the search engines when the site was in development."} {"_id": "26066", "title": "Creating iPad applications faster", "text": "I'm new to iPad application development, but I am an experienced web application developer in ASP.net. I want to develop fine applications for the iPad, but I want to learn from scratch. Please suggest some online tutorial links or guide me through the right path for starting and learning development faster. I have little time and have to learn and develop a lot."} {"_id": "26064", "title": "Creating an encrypted, web-based proxy", "text": "I have moved to Asia where my internet connection is censored and I'd like to check my messages from social sites which happen to be blocked. As virtually all proxy servers are blocked in this country, I've decided to attempt to roll my own encrypted proxy server. Please note, the key word here is _encrypted_ \u2014if the sniffer sees anything like `f@c3b00k` or `w:k:p3d:ia` travelling down the wire I'm had. I have a website hosted with GoDaddy (Windows with PHP 5.2 & IIS 7). Is there any way I can set up an encrypted proxy through this service? 
If so, how, and what open source tools are available to use?"} {"_id": "45406", "title": "Capitals in URL subdirectory not changed by htaccess", "text": "Within my WordPress website I am redirecting subdirectories to internal and external links. Now, when you place a CAPITAL letter in the subdirectory it does not work anymore and the website does not show the same page. * `www.mydomain.com/login/` does work properly * `www.mydomain.com/Login/` does not work properly I am using the following in the `.htaccess` file: # Rewrite DE Domain RewriteEngine On RewriteCond %{HTTP_HOST} ^(www\\.)?domain\\.de$ [NC] RewriteCond %{REQUEST_URI} !^/+subdir/ [NC] RewriteRule ^ http://www.domain.com/%{REQUEST_URI}?lang=de [L,R=302,NC] # BEGIN WordPress RewriteEngine On RewriteBase / RewriteRule ^index\\.php$ - [NC,L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [NC,L] # END WordPress # EXTERNAL Redirects Redirect /login http:// Redirect /Login https://"} {"_id": "9972", "title": "How can I make fonts render the same way across different web browsers?", "text": "I am building a website for a client, and we had hoped to use plain text, not images, in the navigation bar. The font we are using is Century Gothic (I believe that this font is available on the majority of PCs and Macs). The problem is that on different browsers the font renders significantly differently. In Chrome we got it looking the way we want, but in Firefox the text is smaller and bolder. Aside from writing browser-specific JavaScript to alter the font properties, are there any other options to standardize the way the fonts are rendered cross-browser? Perhaps some library or API? Maybe it's a matter of being more specific in declaring font properties? Honestly I am stuck and need help."} {"_id": "50116", "title": "Is it possible to use canonical tag in Blogger posts?", "text": "I found one of my blog posts was cached by Google (`www.example.com/post.html`). 
I found that the comment page of the post was also cached (`www.example.com/post.html?showComment=1372054729698`). These two pages are showing in the Google SERP when I check cached posts of my blog. Is it possible to use a canonical tag on the post `www.example.com/post.html?showComment=1372054729698` so that Google won't penalize my original post? Are there any other ways to redirect a blog post?"} {"_id": "54041", "title": ".htaccess 301 Redirect", "text": "How do I make it so all traffic for the entire site and all subdirectories is redirected from non-www to www? Here is my .htaccess contents: RewriteEngine on suPHP_ConfigPath /home/vivalast order allow,deny deny from all RewriteCond %{HTTP_HOST} ^domain-name.com$ RewriteRule ^(.*)$ \"http\\:\\/\\/www\\.domain-name\\.com\\/$1\" [R=301,L]"} {"_id": "62897", "title": "URL redirection in GoDaddy", "text": "I have two domains from GoDaddy. How do I redirect automatically from http://www.domain1.com/article.html to http://www.domain2.com/blog/article.html"} {"_id": "47969", "title": "Simple RewriteRule", "text": "I'm trying to write a RewriteRule to make a simple URL. I want users to be able to enter www.example.com/somepage and have it take them to www.example.com/abc/somepage.php How can this be done in .htaccess? I've tried these to no avail: RewriteRule ^somepage$ abc/somepage.php [L] RewriteRule ^/somepage$ /abc/somepage.php [L] Your help is much appreciated."} {"_id": "68633", "title": "Google Indexing PHP pages with and without backslash", "text": "How can we use the _.htaccess_ file to only have our pages indexed by Google without the trailing slash? Right now both pages are being indexed."} {"_id": "61598", "title": "How to best redirect user to new site", "text": "I am planning the launch of a new site and trying to figure out the best way to redirect users from the old site. The old site (200+ pages) will be completely retired. 
I'm planning on moving the domain name to our new server, but the only page that will remain will be the index page. The site was built on a CMS that is not compatible with an Apache server. The new site has a completely different page hierarchy, page titles, and domain name. So, I'd like some guidance on the best techniques to do the following: 1. Anytime someone visits old_domain_name.com, either from Google or other external links, I'd like for them to be redirected to new_domain_name.com. I think this can be accomplished with a simple JavaScript redirect from old_domain_name.com. 2. When someone tries to visit old_domain_name.com/about_us.htm I'd like for them to be redirected to new_domain_name.com/new_directory/about.htm. I'd need to do this for 200+ pages. Any insight will be appreciated. Thanks!"} {"_id": "53872", "title": "301 dynamic redirect", "text": "Please note that I would like to change the name of the server's folder (let's say \"examplefolder\") in which I have installed my phpBB site, and thus I would like to do a 301 redirect in order to preserve page rank. Could you please let me know how to perform such an action so that requests for: www.example.com/examplefolder/ are redirected to: www.example.com/administrator/ Actually, I would especially like to redirect the URLs www.example.com/examplefolder/index.php www.example.com/examplefolder/viewforum.php www.example.com/examplefolder/viewtopic.php"} {"_id": "48298", "title": "Force url to use www", "text": "I'm confused about how to force my URL to use www in the address. I have .htaccess code like this: RewriteEngine On RewriteBase / RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] I'm replacing RewriteRule . /index.php [L] with RewriteRule (.*) http://www.mysite.com/$1 [R=301,L] but I still get an error. 
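For reference, a commonly suggested arrangement (only a sketch; `mysite.com` stands in for the real hostname) keeps the WordPress front-controller rule and adds a host-checking redirect before it, so the 301 fires only for bare-domain requests:

```apache
RewriteEngine On
RewriteBase /
# Redirect to www only when the Host header lacks the www prefix.
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
# WordPress front controller, unchanged.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
```

Replacing the front-controller rule outright makes every request (including www ones) match the redirect, which is one way to end up with a redirect loop.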
The page isn't redirecting properly when I open `http://www.mysite.com/category`"} {"_id": "54012", "title": "How to add www domain to Google Webmaster Tools (from GoDaddy)", "text": "When I setup my domain on GoDaddy, I don't have the option to use `www` with the domain name. If want to change the settings on Google Webmaster Tools -> preferred domain to use `www`, I need to verify my domain with `www`. How do I setup the domain to use `www`? Is using `www` important?"} {"_id": "5349", "title": ".htaccess permanent redirect to www", "text": "I have a WordPress installation that is not redirecting URLs not starting with `www`. Example: `http://example.com/dir/` doesn't send to `http://www.example.com/dir/` instead goes to `http://example.com/`. How do I change the _.htaccess_ below to always redirect to the page either using `www` or not? # BEGIN WordPress RewriteEngine On RewriteBase / RewriteRule ^index\\.php$ - [L] # uploaded files RewriteRule ^([_0-9a-zA-Z-]+/)?files/(.+) wp-includes/ms-files.php?file=$2 [L] # add a trailing slash to /wp-admin RewriteRule ^([_0-9a-zA-Z-]+/)?wp-admin$ $1wp-admin/ [R=301,L] RewriteCond %{REQUEST_FILENAME} -f [OR] RewriteCond %{REQUEST_FILENAME} -d RewriteRule ^ - [L] RewriteRule ^([_0-9a-zA-Z-]+/)?(wp-(content|admin|includes).*) $2 [L] RewriteRule ^([_0-9a-zA-Z-]+/)?(.*\\.php)$ $2 [L] RewriteRule . index.php [L] # END WordPress"} {"_id": "53877", "title": "Change of URL by changing folder's name in server", "text": "If I change the folder's name in which I have my web pages, from examplefolder/ to: administrator/ so the file \"viewtopic.php\" that originally had URL: www.example.com/examplefolder/viewtopic.php it will have a new URL: www.example.com/administrator/viewtopic.php/ Can I use the 301 redirect for such a case? 
Or can a 301 redirect only be used when a new folder has been created (not when an existing one is renamed)?"} {"_id": "45517", "title": "Issue with Apache redirection", "text": "I want to redirect `mysite.com` or `http://mysite.com` or `www.mysite.com` or any other format entered by a user to `http://www.mysite.com`. I'm able to achieve this by writing the following lines in my `.htaccess` file: Rewritecond %{http_host} ^mysite.com RewriteRule ^(.*) http://www.mysite.com/$1 [R=301,L] But I want to do this from `Apache`, so I've added the following line in the `virtual host conf file of the site` and removed the above two lines from `.htaccess`: Redirect 301 / http://mysite.com/ But whenever I try to access the site, the following error is displayed: Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects. Where am I going wrong?"} {"_id": "30772", "title": "Add more than one host to user login privileges in phpMyAdmin", "text": "![enter image description here](http://i.stack.imgur.com/SjAOi.png) As the title reads, I want to add more than one host into the _Host_ field in phpMyAdmin's user privileges, to allow a user to log in from a number of different hostnames. How can I accomplish this?"} {"_id": "26068", "title": "Is there a way to hide Facebook advertisements in an application iframe?", "text": "Probably it's not possible, but I need to confirm it. If someone knows, please tell me :) thanks."} {"_id": "43162", "title": "Tracking multiple domains under subdomain", "text": "I need to set up a Google Analytics code for one website with multiple domains, but they are in the form of subdomain.**foo**.com subdomain.**bar**.com The subdomain is the same (it doesn't really matter that much), but the main domains are different. 
So now I use: _gaq.push(['_setDomainName', 'subdomain.foo.com']); and _gaq.push(['_setDomainName', 'subdomain.bar.com']); Is this correct?"} {"_id": "43161", "title": "Multiple PHP 5.x version for different directory", "text": "Is this even possible? I searched everywhere, but all I could find was one post. Can it be done using .htaccess or using the `AddHandler`/`Action` MIME module directives? I have compiled Apache, PHP 5.2, and PHP 5.3 on CentOS at the paths `/usr/local/apache`, `/usr/local/php52`, and `/usr/local/php53`. I have the `libphp5.so` modules in their respective PHP folders, using a symlink at `/usr/local/php` and the `LoadModule` directive. I am able to switch PHP versions too, but now I want different directories to run with different PHP 5.x versions. Is it possible? I don't mind recompiling PHP either."} {"_id": "2539", "title": "Are there any useful tools to mirror a mailman mailing list as a forum?", "text": "We have a mailman mailing list; however, as we all know, this is not very user friendly in terms of searching the archives. I am looking at a way to enable the continued functionality of mailman while having a forum linked to it for a more friendly user-interface approach. Is there a forum application that lets you mirror the mailman service so that posts to mailman are synced into the forum and posts to the forum are synced to mailman?"} {"_id": "43167", "title": "Buying a top level domain specific to my industry, will it improve my seo ratings?", "text": "Currently we run a few websites selling products in the travel industry using .com and .co.uk extensions. Would our search ranking improve if we were to buy one of the .travel domains? 
So we would replace, or more likely supplement, the following mycompany.com myCompany.co.uk with: mycompany.travel Bing and Google rankings are, unsurprisingly, our main concerns."} {"_id": "43164", "title": "Server comparisons - Comparing servers based on their specs", "text": "We are busy creating a server hierarchy manager (possibly not the greatest name). What this is intended to do is determine which of our servers is the best for doing work. We have identified the following points as the most important criteria which we would like to base our decisions on: * RAM * OS * Number of cores We had considered including architecture; however, with the amount of RAM we will likely be using in our x64 servers, the amount of RAM should be able to indicate the architecture too. Considering the example of having 3 servers available running on different operating systems, with different amounts of RAM, different numbers of cores, etc., how would we figure out which is the best server to designate as the \"primary\" server? What we have considered at the moment is creating a simple metric whereby each section (RAM, OS, and cores) is represented by a value out of 1 (where 1 is our recommended requirements) and comparing the servers this way. Is this a good approach to the problem? Does anyone have any better ideas or know of any tools that can assist? EDIT: Let me explain further. These servers are basically just processing engines. They will all talk to the same DB. The scenario is this: We could remove or add servers at any time, say for example the primary, and the rest would have to figure out amongst themselves who the primary is. When the primary is re-added, the hierarchy should realise that there is a new primary again. We have the mechanism for this in place already. My question is with regard to the metric. 
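The normalized "value out of 1 per criterion" metric described above could be sketched roughly like this (a sketch only: the field names, OS weights, and recommended-spec numbers are illustrative assumptions, not our real requirements):

```python
# Sketch of the per-criterion scoring metric described above.
# RECOMMENDED and OS_WEIGHT values are illustrative assumptions.
RECOMMENDED = {"ram_gb": 16, "cores": 8}
OS_WEIGHT = {"linux-x64": 1.0, "windows-x64": 0.8, "windows-x86": 0.5}

def score(server):
    """Average three per-criterion scores, each capped at 1.0."""
    ram = min(server["ram_gb"] / RECOMMENDED["ram_gb"], 1.0)
    cores = min(server["cores"] / RECOMMENDED["cores"], 1.0)
    os_pref = OS_WEIGHT.get(server["os"], 0.0)  # unknown OS scores 0
    return (ram + cores + os_pref) / 3

servers = [
    {"name": "a", "ram_gb": 32, "cores": 8, "os": "linux-x64"},
    {"name": "b", "ram_gb": 8, "cores": 4, "os": "windows-x64"},
]
# The highest-scoring available server would be designated primary.
primary = max(servers, key=score)
```

One design point worth questioning is the cap at 1.0: it treats exceeding the recommended spec as no better than meeting it, which keeps the score comparable across criteria but discards information about headroom.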
In terms of determining what would be the best primary server, are there any other relevant factors to consider?"} {"_id": "64940", "title": "What is the correct DTD for Schema DATA placement in XHTML pages?", "text": "We get a W3C validation error after placing schema data in our website, which was built in XHTML. > Line 198, Column 41: there is no attribute \"`data-id`\" > > > > (Link to the validator result) Will changing the DTD solve this? We use, "} {"_id": "61731", "title": "Is inline JavaScript good or bad?", "text": "I've recently switched from WordPress to the Yii framework and an external developer is rebuilding my site. One thing I notice is that each time he invokes AJAX / jQuery the Yii way, it embeds JavaScript inline in the web page. Although it seems the JavaScript code is placed below the footer, it's still inline. In general, it seems that JavaScript code is more frequently being put inside the webpage. I've always been taught that JavaScript scripts should, as much as possible, be placed in JavaScript files. Is this a coming \"new\" trend? Or should I still try to keep JavaScript code in separate files?"} {"_id": "2536", "title": "Is it possible to make all html links to share the same title?", "text": "The thing is I have a table containing ~2000 rows where each row contains a link that has the same title, e.g. My 1st link [...] My 2000th link I'm wondering if it is possible not to have this very long title string 2000 times!?"} {"_id": "13485", "title": "How to restrict the download of all files in a folder?", "text": "I have several .pdb files, which should be viewed in my website by clicking a link on a page. But when a pdf file opens, the directory where all the .pdf files reside is visible in the address bar. So if the user types http://mySite.com/myDir in the address bar, he will see and be able to download all files at once. But I don't want that! When I run `chmod 644 myDir`, the files become invisible even to my site - i.e. 
you cannot open them by clicking the link I mentioned. How do I solve this problem?"} {"_id": "15838", "title": "How can I prevent people from looking at a listing of files in parent directory if I haven't uploaded index.html?", "text": "> **Possible Duplicate:** > How to restrict the download of all files in a folder? I haven't uploaded index.html or index.php to my root directory. How can I prevent people from looking at a listing of files in the parent directory? ![](http://oi56.tinypic.com/sc739e.jpg) Also, is it possible for people to obtain a list of all the files in the root directory once I upload index.html? I'm currently using .htaccess and htusers to prompt someone to enter a username and password when they try to access any file in the root directory. This may sound like a weird request, but would it be possible to have them come to the site (without an index.html) and just have them not see the files? All it would say on the page would be the following: **Index of/** _Apache Server at mysite.com Port 80_"} {"_id": "25853", "title": "How to make upload images harder to access from url?", "text": "> **Possible Duplicate:** > How to restrict the download of all files in a folder? I am wondering how Facebook, Google+, Twitter and so on prevent images users upload from being navigated like: http://interactionhero.com/images/ .... I have an application where users upload to their folder /profile (chmod 777), and if I go to www. ___ _.com/profile , all the images are accessible. Is there a way I can prevent this or improve it?"} {"_id": "67538", "title": "Implementation of rel=\"prev\" and rel=\"next\" inside a site code?", "text": "I searched a lot regarding my question, but I never found the exact piece of information explaining how to implement rel=\"prev\" and rel=\"next\" inside the code of a certain site? 
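For what it's worth, the pattern being asked about is a pair of link elements in each paginated page's `<head>`, pointing at the neighbouring pages in the series (the URLs below are made-up placeholders):

```html
<!-- hypothetical page 2 of a paginated article list -->
<link rel="prev" href="http://www.example.com/articles?page=1">
<link rel="next" href="http://www.example.com/articles?page=3">
```

The first page of a series would carry only `rel="next"`, and the last page only `rel="prev"`.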
I'll be very grateful if someone could give me good information on this matter."} {"_id": "68141", "title": "Adsense value per click", "text": "On my website, I have seen a significant decline in revenue from AdSense in recent weeks. My number of clicks is the same as in prior months, but the payout is way less. Does anyone know, or can anyone direct me to, how Google determines the payout per click?"} {"_id": "58031", "title": "Finding different Windows XP Service Packs via User-Agent string", "text": "I am planning to get rid of the SHA1 certificate signing algorithm in our certificates and switch over to SHA-256. The problem is that Windows XP (before SP3) does not support this. Therefore I need to know how many people are still using these versions. Is there a way to separate Windows XP SP1 and SP2 from SP3? I know that SP2 brought the \"SV1\" (Security Version 1) token to its version string, but what about SP1 and SP3? Is there a way to distinguish these versions?"} {"_id": "51867", "title": "Does a lead-to screen with AdSense ad conform to Google's rules?", "text": "Re: Google's ad placement policy I have noticed that when clicking on some Forbes links, I am taken to a screen with an ad in the middle - at the top there is a link to skip the ad. Upon clicking on the skip link I am taken to the article I want to view. I want to implement something similar on my sites where, when clicking on a search result, the results window first displays one AdSense ad on the screen with a similar UI to what I saw on Forbes. Currently, when a user clicks on a result, a new tab/window opens with the result. What I am proposing is that before the result appears, the screen displays a \"Continue to result\" link at the top in large letters, and in the center of the page, \"Advertisement\" with the ad below. This is the only popup that is user initiated and there are no other popups on the site. Navigation elements are not modified in any way. 
Will I get penalized by Google for implementing this?"} {"_id": "58034", "title": "How to get the number of visits by page, for pages from a website?", "text": "How do I get the number of visits by page, for _all_ pages of a website? (I use Google Analytics, and know how to create a report for that, but I only get the first 10 pages)."} {"_id": "58037", "title": "Submitting sitemaps with 2 different websites but the same content", "text": "I have two domains running on the same server with the same IP (same content). For example: the sitemap of `example1.com` represents one website, and `method1.com`, `method2.com` represent many pages to come. So can I use only one sitemap for both, or should I make different sitemaps? http://xyz.com/ monthly 1.00 http://abc.com monthly 0.80 If I'm working on the second URL it affects the homepage also. Do I need to create a second URL for the sitemap, or a homepage URL?"} {"_id": "57419", "title": "My email based support form on my shared host is being blocked by Gmail", "text": "On my site (developed with `ASP.NET`, if that matters) I have a support form, which is nothing more than three fields (i.e., name, email, and message) plus a Send button. The form code generates emails from provided user input and sends them to: `support@mysite.com`. The form uses this address to send the messages: `sender@mysite.com`. All the email for `mysite.com` is handled by Google Apps for Business. The problem is: Google recently started marking emails sent from `sender@mysite.com` as spam: Delivery to the following recipient failed permanently: support@mysite.com Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the server for the recipient domain mysite.com by aspmx.l.google.com. [74.125.25.26]. The error that the other server returned was: 550-5.7.1 [ip-address-was-here 7] Our system has detected that this message is 550-5.7.1 likely unsolicited mail. 
To reduce the amount of spam sent to Gmail, 550-5.7.1 this message has been blocked. Please visit 550-5.7.1 http://support.google.com/mail/bin/answer.py?hl=en&answer=188131 for 550 5.7.1 more information. sz7si15854448pab.203 - gsmtp Obviously, visitors on my site **can** put spam messages in my support form. Anyway, I would prefer that these messages end up in my Spam folder (or better, the Inbox!) of `support@mysite.com`. So, what can I do to prevent Google from blocking `sender@mysite.com`? Shall I use a 3rd-party service for the support form? Or shall I tweak my site / form code somehow? My site is on shared hosting, if that matters."} {"_id": "48616", "title": "Will Google index images from HTML select tag", "text": "If I have code like this: > > > Will Google index images from the select tag?"} {"_id": "48161", "title": "Will my domain get blacklisted by auto-replying to spam emails?", "text": "At the moment, I've just set up an auto-reply function for our career email address. One problem is that we cannot differentiate between an email applying for an advertised job and a spam email. I wonder if eventually our domain may get blocked because we automatically send out so many emails every day. I have no idea how the blacklisting mechanism works. If this situation can really happen, I'd be very grateful if you could suggest a solution for me :)."} {"_id": "38547", "title": "Buying Backlinks", "text": "I came across a website the other day that was selling backlinks. The site was well designed and promised some results for a nice low price, but not too low. After a couple of minutes it started to sound similar to buying an email marketing list, which I know is not something you do. I assume that buying backlinks is considered a black-hat SEO trick and should be avoided. Am I wrong in my assumption?"} {"_id": "17167", "title": "ISAPI_Rewrite and asp.net events upon postback", "text": "I have a rewrite rule to rewrite `www.mysite.com/default.aspx` to `www.mysite.com`. 
RewriteRule ^default.aspx$ / [NC,R=301,L] On that site I have an asp.net server `Button`. When I click the button it tries to postback to `default.aspx`, but gets `301`'d to the root. This prevents the button's click event from triggering. How can I redirect `default.aspx` without breaking my button's event?"} {"_id": "38817", "title": "How do I change pages registering as 404 to 200", "text": "I have this problem. After relaunching my site: http://www.kgstiles.com, traffic dropped immensely (about 60%). After troubleshooting for a week and a half - losing thousands of dollars of lost traffic in the process - I found that Google was getting a 404 error at the end of many of my 301 redirects (so it wouldn't index the new pages). Most of the pages, though, would load in my browser. They registered as a 404 error in Google's index as well as in a 404 checker. So my first question is: could this be what's causing my loss of traffic? And second: how do I fix it? I'm desperate! Any help is appreciated! # BEGIN s2Member GZIP exclusions RewriteEngine On RewriteBase / RewriteCond %{QUERY_STRING} (^|\\?|&)s2member_file_download\\=.+ RewriteRule .* - [E=no-gzip:1] # END s2Member GZIP exclusions # BEGIN WordPress RewriteEngine On RewriteBase / RewriteRule ^index\\.php$ - [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] RewriteEngine On RewriteRule ^moreinfo/(.*)$ http://www.kgstiles.com/moreinfo$1 [R=301] RewriteRule ^healthsolutions/(.*)$ http://www.kgstiles.com/healthsolutions$1 [R=301] RewriteRule ^(.*)\\.html$ $1/ [R=301,L] RewriteRule ^(.*)\\.htm$ $1/ [R=301,L] # END WordPress"} {"_id": "5079", "title": "Can Drupal do this?", "text": "I am a PHP developer using mostly CakePHP, Magento, and WordPress. I want to create a community-driven website aimed at the volunteering community that will let users sign up, create profiles, add previous voluntary positions, let organisations sign up and post jobs, etc. 
Is this something that Drupal can handle? Is this what Drupal was built to do? I'm just wondering how Drupal deals with custom methods. Say I wanted to have a user request a reference from someone; I'd have to write methods that did this. Would that be possible in Drupal?"} {"_id": "13902", "title": "When designing a website, how do you usually start?", "text": "In the past I've done some HTML, CSS and PHP coding, but I've only ever worked on things that were already \"finished\" and I've never started a website from scratch. However, I recently purchased a domain and some hosting for cheap, and I'm hoping to improve my HTML, CSS, and Javascript skills so that I can become a bit more marketable as a web developer/webmaster. Typically, what process do you use to start a website? Do you draw the layout on paper and then do the code? Or do you just start coding and tweak the output to your liking?"} {"_id": "48405", "title": "Getting started with a website", "text": "So I know `HTML`, `CSS`, `Javascript`, `jQuery`, `PHP`. But I'm not sure how to put it all together to begin making an actual website. I'm not looking at something too big here since I'm just a beginner, but I need to start somewhere. More specific stuff: 1. How do I carry my layout over to every page of my website, without having to redo it for each page? Not only layout, but also stuff like menus which have to be on every page. 2. Should I use software like `Drupal`, or should I start off alone and worry about that later? 3. Is there a place where I can read about all this stuff?"} {"_id": "17163", "title": "Facebook Promotion guidelines", "text": "Hi, I want to build a Facebook `promotional/competition` page and just want to make 100% sure that I am within the rules set by the Facebook Promotions Guidelines. Here is a link to the guidelines http://www.facebook.com/promotions_guidelines.php I want to build a video voting page on Facebook. 
Facebook users cannot enter the competition. They can, however, vote for their favourite video and on voting post a message to their wall. The video with the most votes will be declared the winning video. edited A random voter could also stand a chance of winning a prize. Here is an extract from the guidelines that I am unclear about. ` 3\. You must not use Facebook features or functionality as a promotion\u2019s registration or entry mechanism. For example, the act of liking a Page or checking in to a Place cannot automatically register or enter a promotion participant. This is pretty self explanatory. Thus we cannot ask someone to \u201clike\u201d or vote for their favourite video in order to enter or stand a chance to win. We can ask them to do these things, but not as a condition of being part of the competition.`"} {"_id": "52570", "title": "Linking several separated schema.org itemscopes together?", "text": "I'm adding schema.org semantic markup, and would like to link several items that are in quite different parts of the webpage together. This works when I specify them as parent/child:
    [example markup stripped: two nested itemscopes, one with distance 11 km and a child with a first event dated 2001-01-01]
    However, I don't know how to link them when they're separated by a mountain of HTML:
    [example markup stripped: two separate itemscopes, an event dated 2002-02-02 and an item with distance 22 km]
    I guess I should use \"itemref\" or something to link related scopes, but I can't get it to work. Looking at similar questions didn't help in this case. UPDATE: note that I'm specifically looking for a solution that works via some kind of referencing and does NOT work via nesting of any type. Also, the ordering of the second (non-working) example must stay the same (Event first, ExerciseAction second). (There are several reasons for that outside of my control.) Specifically, \"it works\" is defined as at least the Google Rich Snippets tool showing that the items ARE linked together (as it does for the first working example)."} {"_id": "49627", "title": "Is there a way I can redirect search engine traffic by U.S. State or Zip Code?", "text": "I work for an e-commerce company that operates solely within the United States. However, the price of goods and availability of our products depend on what state or even ZIP code you reside in (we sell alcohol). I know that there are techniques for localized SEO on a national scale, but I was wondering if there is any way we can do it on a province/state scale. I have been searching for an answer, but everything seems to be on the international scale. For example, we would prefer it if Google/Bing/Yahoo knew the location of their visitors and could point them to: `oursite.com/?state=CA` Or `oursite.com/?state=TX` Or `oursite.com/store/product1?state=MA` Is there any way to convey this information to the search engines with either canonical tags or another approach?"} {"_id": "5072", "title": "Where can I get high quality vector graphics for web design?", "text": "I am looking for a place where I can buy web 2.0 (cartoony, Mac-like) vector graphics for use on my sites. Where's the best place to go?"} {"_id": "65619", "title": "Protect a URL from being used in an iframe except on a single site", "text": "I need to pass URLs to a client and they would embed those URLs on their site within an iframe. 
I was wondering what's stopping the user (who is visiting the client's site) from copying the URL from the iframe source and using it somewhere else? I want the URL to be used only by the client I provide it to. Update: I was thinking of adding an extra parameter to the URL that I pass to every client, to identify which client it is and whether the request is coming from the same client, but I can't find any way to access information about where the request is actually coming from. In short, an iframe doesn't have access to its parent."} {"_id": "68135", "title": "Web user authentication methods", "text": "Is there any method besides using a DB with username and password for authenticating a user?"} {"_id": "22834", "title": "Getting statistics on traffic sources for website usage", "text": "Does anyone know a reliable source for global statistics on how users access a webpage? E.g., Site Referrer - 48% Search Engines - 41% Direct Entry - 1%"} {"_id": "5074", "title": "In Drupal, how do I auto-list headlines from a news section on the homepage?", "text": "**Requirements:** * Headlines also go to an RSS feed on the home page * There needs to be a link below the listing to \"View More News\" (meaning it goes to the news section)"} {"_id": "5075", "title": "What's the best way to version CSS and JS URLs?", "text": "As per Yahoo's much-ballyhooed Best Practices for Speeding Up Your Site, we serve up static content from a CDN using far-future cache expiration headers. Of course, we need to occasionally update these \"static\" files, so we currently add an infix version as part of the filename (based on the SHA1 sum of the file contents). Thus: styles.min.css Becomes: styles.min.abcd1234.css However, managing the versioned files can become tedious, and I was wondering if a GET argument notation might be cleaner and better: styles.min.css?v=abcd1234 Which do you use, and why? 
Are there browser- or proxy/cache-related considerations that I should keep in mind?"} {"_id": "60443", "title": "Avoid duplicate link to same page with #", "text": "I have one link in the main nav, `example.com/photo`, but I want to add more links on the page which go to the same page as this nav link. Will `example.com/photo#` help me avoid duplicate links to the same page?"} {"_id": "60442", "title": "Best practice for using 'www.domain' vs. naked domain?", "text": "DreamHost gives me the option to use `www` for my site's URL, to remove `www` from the site's URL (i.e., \"naked domain\"), or let the visitor decide. Is there a best practice for this? I notice Google treats the naked domain vs `www` as separate sites (they are both registered in Google Webmaster Tools). I would prefer they be treated as one unified site by allowing DreamHost to do a \"moved permanently\" redirect on one or the other. What research is available that shows one style is more comfortable to users than the other?"} {"_id": "60441", "title": "Check to see if emails are getting automatically sent", "text": "I have a Drupal site that is supposed to send emails every time new content is created. The emails are sent to admin users so that they can check the content. None of the emails are making it to the inbox. I would like to know if there is a way to log and view all of the emails that are being sent from my server. I am currently running: * Apache/2.2.22 (Debian) * PHP Version 5.4.4-14 * Postfix MTA (may not be working; how can I tell?) I would also like to do this to make sure that my site is not sending out junk."} {"_id": "49625", "title": "Nginx Browser Caching using HTTP Headers outside server/location block", "text": "I am having difficulty setting the HTTP expires headers for Nginx outside of specific server (and then location) blocks. 
What I want is something like the following: location ~* \.(png|jpg|jpeg|gif|ico)$ { expires 1y; } But I don't want to have to repeat it in every single server block, because I am hosting a large number of sites. I can put it in every server block, but it's not very DRY. If I try to put that into an HTTP block or outside of all other blocks, I get \"location directive is not allowed here.\" It seems I have to put it into a server block, and I have a different server block for every virtual host. Any help/clarification would be appreciated."} {"_id": "60445", "title": "How to create a separate blog page on my site with a different theme", "text": "My website is currently set up as a blog, with categories and posts within them. However, I want to create a separate page where I can discuss news, ideas, and thoughts, while having a different theme."} {"_id": "60448", "title": "Shared Gmail account and privileges management", "text": "I'm working with a company from abroad and I get paid on commission upon customer creation. This means that till now I've been managing the company email myself, as I was afraid of getting tricked (e.g. if the company dealt directly with the customer and deleted the emails, I wouldn't find out and wouldn't get paid). My question is: has anybody here worked with a shared single Gmail account? And if so, is there a way to set privileges in the account? E.g. user A can create, read and destroy, but user B can only create and read. That way I could control the email flow and speed up the selling process, as right now when the customer asks a question that I can't answer, I have to contact the company, wait for the answer and answer back, which is a total mess. 
Anyway, any light shed upon this particular issue would be more than appreciated!"} {"_id": "61764", "title": "Are there any problems with pages from the same site using different doctypes?", "text": "After a re-design of some pages on our site, we have the newly re-designed pages with doctype HTML5 and all the remaining pages (that are not yet redesigned) with doctype XHTML 1.0 Transitional. Does that create any problems (SEO or otherwise)? There is a similar question (Is it bad to have a mix of HTML 5 and XHTML pages within one website from SEO perspective?) but it only touches on the SEO subject. I am wondering if any other kind of issue can arise from this situation."} {"_id": "31796", "title": "Canonical URL for a home page and trailing slashes", "text": "My home page could potentially be linked as: http://example.com http://example.com/ http://example.com/?ref=1 http://example.com/index.html http://example.com/index.html?ref=2 (the same page is served for all those URLs) I am thinking about defining a canonical URL to make sure Google doesn't consider those URLs to be different pages: (relative) (trailing slash) (no trailing slash) Which one should be used? I would just slap `/` in, but messing with canonical seems like scary business so I wanted to double-check first. Is it a good idea at all to define a canonical URL for a home page?"} {"_id": "9560", "title": "Example sites which use UCC certificates", "text": "Can anyone point me to a few sites that make use of UCC (SAN) certificates? I tried to search for this but found a lot of information about UCC certificates without any examples. As a sanity check before buying/configuring a UCC certificate, I wish to do some basic testing to determine exactly how the certificate will look in different browsers. Yes, I realize I could just use makecert instead. 
I would rather just look at them in the wild."} {"_id": "60585", "title": "local install of wp site brought down from host - home page is ok but other pages redirect to wamp config page", "text": "I have a local install of a WP site brought down from the host - the home page is OK, but other pages redirect to the WAMP config page. I copied all the site's files from the host to the www dir under local WAMP. I got the database from the host, loaded it into a new local DB, and used a tool to adjust `site_on_web.com` to \"localhost/site_on_local\". Now the home page works great and I can log in to the admin page, but when I click on the reservations page (and others), the site just goes to the WAMP server config page, even though the URL shows correctly as `localhost/site_on_local/reservations`. My htaccess file is this: # BEGIN WordPress RewriteEngine On RewriteBase / RewriteRule ^index\.php$ - [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] # END WordPress and the rewrite module is checked in the PHP/Apache modules setting. Now when I uncheck the rewrite module or clear out the whole htaccess file, the pages just go to Not Found: The requested URL /ritas041214/about-ritas/ was not found on this server. Please help, as I am now unsure about my process for moving the local site up and down and making it work; without this I am lost..."} {"_id": "9563", "title": "How can I monitor a website for malicious changes to the files", "text": "I had an occasion recently where our website was compromised - a link farm was added to a couple of the pages on one occasion, and on another occasion, a large and nasty aspx file was put on the server. I won't mention the host's name (Hostway), but I was pretty annoyed that someone was able to do this. No, it wasn't a leaky password - around 10 sites hosted by HW with consecutive IP addresses got trashed. Anyway. 
What I need is a utility or service (preferably free) that takes a snapshot of my website's contents, and then regularly monitors the files (size and datestamp) for unauthorized changes or additions, and alerts me. I've used web services that monitor one file for changes, but I'm looking for something a bit more aggressive."} {"_id": "57435", "title": "Domain Creation - Any Difference?", "text": "Is there any difference between hosting a `.com` domain in Europe or in America? Can it still be reached from both continents equally well?"} {"_id": "9566", "title": "Best SEO practices for mobile URLs: 301, rel=canonical, or something else?", "text": "I am developing a site with a mobile version and am trying to figure out the appropriate way to manage the URLs for search engines. So far I've considered: 1. Having a separate mobile site (m.example.com) with rel=\"canonical\" links to the regular site. 2. Putting both the mobile site and full site on one URL (example.com), and doing user agent sniffing. 3. Another opinion: > Spencer: \"If you have a mobile site at a separate location or URL, you > should 301 redirect each and every mobile page to its corresponding page on > your main website. Employ user agent detection so that the mobile optimized > version is served up if someone's coming in from a hand-held. - > http://developer.practicalecommerce.com/articles/1722-Mobile-site- > Development-Best-Practices-for-SEO-Usability Both 2 and 3 make it hard for a user who wants to switch to the full site or mobile site manually, but I'm not sure 1 is the best alternative. What's the best way to write URLs for a mobile site?"} {"_id": "9568", "title": "Is it possible to mod_rewrite BASED on the existence of a file/directory and uniqueID?", "text": "My site currently forces all non-www pages to use www. 
Ultimately, I am able to handle all unique subdomains and parse them correctly, but I am trying to achieve the following (ideally with mod_rewrite): when a consumer visits www.site.com/john4, the server processes that request as: www.site.com?Agent=john4 Our requirements are: * The URL should continue to show www.site.com/john4 even though it was redirected to www.site.com/index.php?Agent=john4 * If a file (of any extension) or a directory exists with that name, the entire process stops and it tries to pull that file instead: * for example: www.site.com/file would pull up www.site.com/file.php if file.php existed on the server; www.site.com/pages would go to www.site.com/pages/index.php if the pages directory exists. Thank you ahead of time. I am completely at a loss right now."} {"_id": "931", "title": "How to deal with multiple step forms where one of the forms is not submitted", "text": "Say you have a submission process that requires two forms. The submission of the 2nd form triggers an email. What do you do if the second form isn't submitted? Say the user forgets or closes the window by accident. My only solution was to update the DB, upon submission of the 1st form, with a flag that indicates an email has not been sent. Submission of the 2nd form will trigger an event to send an email and update the DB with a flag that indicates an email has been sent. A cron job will run periodically to check for any flags for unsent emails, gather the information, and send an email. Is this the best way of dealing with the issue? Thanks. **Edit: Clarity** Form 1 User submits general data Message 1 User presented with a choice. User needs to choose to \"pass\" or \"accept\". If the user submits the \"accept\" form (Form 2), an email is sent. If the user submits the \"pass\" form (Form 2), they are presented with another message with another choice (Form 3). 
If the user does nothing, the last choice provided is their default choice, but at this point an email still needs to be sent. I hope that's clearer."} {"_id": "35529", "title": "Allowing temporary access to cpanel for freelancer", "text": "I'm currently using a2hosting's cPanel on a shared hosting account. I'd like to temporarily give access to a freelance developer so he can upload some PHP files and CSV files, and create tables in my database. Does anyone know how I can do that? I'd like to be able to revoke their access after a few weeks. Thanks for your time."} {"_id": "935", "title": "Multi site wordpress setup", "text": "Currently my company has 3 blogs, and what I did was install three instances of WordPress over Apache/MySQL in different directories. The problem is that I have a Slicehost VPS with 256MB RAM running Ubuntu 8.04, and MySQL is crashing Linux or making it very slow and unresponsive. Is there some kind of optimal setup for this scenario? I know that my server is too _cheap_, but I'm not sure whether an upgrade to 512MB will fix things. I'm thinking about migrating to nginx, but what about MySQL? Is there any solution to this? Is this the right site to post this question, or is it `serverfault`? Thanks"} {"_id": "10740", "title": "Can anyone recommend a Google SERP tracker?", "text": "I want to track my website's position in Google's search results for around 50 keywords/phrases and am looking for a nice webapp to automate this process. Ideally I want to see pretty line graphs of my keyword positions. I'm currently trying Raven Tools and SheerSEO but am not particularly impressed with either... I guess my budget is up to \u00a325-30/$30-40 for a decent bit of software"} {"_id": "14520", "title": "How to protect PDF file from indexing?", "text": "I have a link to a PDF document on a public web page. How do I prevent search engines from indexing this link and PDF document? The only idea I thought of is to use CAPTCHA. 
However, I wonder if there are any magic words that tell a search engine not to index the link and PDF document? Options using PHP or JavaScript are also fine. Just to make it clear: I do not want to encrypt the PDF and protect it with a password. I just want to make it invisible to search engines, but not to users."} {"_id": "5382", "title": "Mobile site link to youtube video, close app after video finished?", "text": "I'm linking to YouTube videos on my mobile site. On iPhone, when you click on a YouTube link, it begins playing using the YouTube app. When the video is finished, the YouTube app is still open--you have to close it and reopen the browser to get back to where you were. Is there a way to make it go back to the browser when it is finished playing?"} {"_id": "58960", "title": ".htaccess - delete characters from URL and redirect", "text": "I need help modifying an _.htaccess_ file to delete characters from URLs and redirect. The _.htaccess_ file is currently empty. 1. I want to change the word `article` to the word `topic` in all the URLs. The current URL: www.example.com/index.php?action=article&cat_id=011&id=37&lang=en 2. I want to delete the last characters of the URLs, `&lang=en`, and in some URLs `&lang=` The current URLs: www.example.com/index.php?action=article&cat_id=011&id=37&lang=en www.example.com/?action=article&cat_id=011&id=37&lang= So the final URLs: www.example.com/index.php?action=topic&cat_id=011&id=37 www.example.com/?action=topic&cat_id=011&id=37"} {"_id": "33098", "title": "User-agent identification and SEO crawler database", "text": "I have been asked to analyze the traffic log of a site. In particular, I have to identify the crawlers starting from the collected `user agent` values. I know there are 'trap' links that you can use to distinguish crawlers from human beings. For now I will only analyze the `user agent` values. Now the question: is there a public catalogue or library of web crawlers? **Edit** Here is the second question. 
There are also _a lot_ of empty user-agent values in my traffic records. Is an empty user-agent header related to a crawler or to an automatic process?"} {"_id": "51406", "title": "Google not crawling my website", "text": "I own the website http://citrusbug.com. We did not have proper content before, so to prevent search engines from crawling our site we added a _robots.txt_ with rules to block crawling. Now we have updated our site's content and want to get it crawled, but Google is not crawling it yet. We removed the rules from _robots.txt_ that blocked Googlebot from crawling, but it is still not getting crawled. We made this change about 20-21 days ago, but Google is still not crawling our site, and if I search on Google it displays: > A description for this result is not available because of this site's > robots.txt \u2013 learn more. Kindly guide me: what do I need to do?"} {"_id": "54208", "title": "How to include Facebook's \u201cSend To Mobile\u201d button in my Facebook App?", "text": "I have developed a simple Facebook game listed in the Facebook App Center. It has a \"Send to mobile\" button which basically sends the download link to the phone after prompting the user to enter his/her mobile phone number. I want to include that button in my application, but I can't find any API for it. How should I include it in my game? Its HTML code is: Send to Mobile"} {"_id": "5387", "title": "Google Search results not showing up", "text": "I have added a custom Google Search to a site, and while the search box functions, the results page is pathetically empty. Here is the search box code:
    Here is the display code:
    The site: http://www.nu-living.com The search results page: http://www.nu-living.com/info/search_results I have done this on several other sites with success, but this one is just not cooperating. I would appreciate any assistance in getting this to work correctly"} {"_id": "14844", "title": "How do I separate web hosting space and email hosting space?", "text": "I have a VPS account from KnownHost. I have clients who have their website hosting on my server. Some have subscribed to 50MB, 100MB, 500MB or 1GB as per their requirements. They are also allowed to create email addresses in their accounts, but the email storage is shared with the website's space. Now I would like to separate the web hosting space and the email account's space. So what is this called, and where can I start?"} {"_id": "14661", "title": "How to configure Drupal email functionality on GoDaddy server", "text": "I have Drupal 6 installed on a GoDaddy server. The email functionality used to work fine, but about a month ago they moved my site to their newer generation of server, and I discovered today after setting up some triggers that Drupal is no longer sending emails, including contact form emails. Can anyone direct me to a concise guide to configuring Drupal email functionality in general, or specifically for GoDaddy? I tried looking on GoDaddy's support site but couldn't find anything useful (big surprise there)."} {"_id": "18561", "title": "How would I make something like the jQuery plugins subsite?", "text": "I'm currently creating a website. As part of the website, I want people to be able to have their own subsite that they can add content (text files with descriptions) to (think of the jQuery plugins page). Is there any package that does this? If not, what would be the process of learning how to do this? I'm using HostGator, so I can do it in PHP, Perl, RoR, and CGI (not sure what that is; I think it's Perl). 
I would prefer PHP, as I think it has the lowest learning curve, and I've dabbled in it already. My goal is to have a site where I can post content on the main page and other people will have their own directories (example.com/user/johnDoe) where they can post file types of my choosing, but I'm still the site master and they're just my minions."} {"_id": "11835", "title": "Google Custom Search can't find anything other than the main page", "text": "I added a Google Custom Search to my website several weeks ago, and it has been unable to find anything other than the home page of my site. I have manually submitted a sitemap to the custom search, and to Webmaster Tools (which for some reason the custom search can't find, though it says I should add one). I understand there are not a lot of details here, but I don't have much to go on. I've double-checked my robots.txt; there's nothing there that's preventing the indexing of my pages. EDIT: Actually, does Google Custom Search work any differently than a regular Google search? I assumed that it indexed separately from the regular Google search, but I guess it's possible that both a Google Custom Search and the normal Google search draw from the same pool of pages. In that case the only way to get the custom search to find my pages is to get Google to crawl them...which pretty much makes the custom search useless if it can't find the most recent things I've posted."} {"_id": "11833", "title": "APC on 1and1 shared hosting?", "text": "Is anyone aware of a way to use APC on a 1and1 shared hosting account? They don't have the PECL extension installed by default."} {"_id": "14842", "title": "How to set up an A record for GitHub Pages on NearlyFreeSpeech.net", "text": "I own the domain **zenstealth.com** and I have decided that the easiest way for me to \"do\" a blog is via GitHub Pages and Jekyll, which is already built in to GitHub Pages. 
I've done that already, and for now I've already set up a CNAME record so that my GitHub pages repo **zenstealth.github.com** redirects to **blog.zenstealth.com**. What I want to do is instead of using a sub-domain for the blog, I'd like to make it use the top level domain **zenstealth.com**. The GitHub Pages instructions say to set an A record to the IP _207.97.227.245_. The problem in NearlyFreeSpeech.NET (let's call it NFSN for short) is that it already sets A records to files which are hosted directly on NFSN, and I have absolutely no idea how to override this."} {"_id": "18564", "title": "SEO in image's path?", "text": "Is including keywords in the image's _folder_ path of any importance at all for SEO? (Keywords for the image's _filename_ are important.) / **foobar** /images/keyword.jpg vs / **keyword** /images/keyword.jpg"} {"_id": "22285", "title": "How does Google find a domain with no links to it?", "text": "I recently registered a new domain, pointed it to my existing server, and set up a minimal page just saying \"test\" and nothing else. I just discovered tonight that the page is already indexed in Google! There are no links to the site (I haven't even told anyone about the domain since I haven't done anything with it yet). Is Google trawling WHOIS records or something?"} {"_id": "49895", "title": "Is it possible for Google to discover a website that is not linked to anywhere?", "text": "My colleague claims that he has seen cases where websites were \"discovered\" by Google without having a link to them anywhere on the web. For him, this means that the only way to avoid being indexed (for example when a website being developed is already online, but not ready to be used by customers yet) is to restrict access to the files on the server with a password. How can Google find a website without following links? 
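To keep an in-development site out of the index without relying on obscurity, the usual combination (beyond password-protecting the server) is a blanket robots.txt plus a per-page robots meta tag; a minimal sketch:

```text
# robots.txt at the site root: ask all crawlers to stay out
User-agent: *
Disallow: /
```

```html
<!-- on every page of the dev site -->
<meta name="robots" content="noindex, nofollow">
```

Note these are directives crawlers may honour, not access control; only a password actually prevents retrieval.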
If for some reason the site is discovered, isn't robots.txt and a global `noindex` robots meta tag (on every page) enough?"} {"_id": "48697", "title": "Automatic HTML clean up", "text": "What are some of the most popular tools for automatically cleaning up HTML code into its cleanest, most _best practices_ rendition? The only one I know is the venerable HTMLtidy. Please see this for examples of best practices."} {"_id": "48698", "title": "User uses \"mark as spam\" to delete email", "text": "I want to ask, what would be the recommended procedure for dealing with users who use the \"mark as spam\" button to just delete emails? Recently I have a user, who requested a \"password reset\" email 3 times. He/she successfully changed the password, and then marked all three emails as spam. This dragged my SendGrid reputation down from 98% to 85%. This is pretty bad, since any reputation lower than 80% is subject to review. I am thinking of writing an email to that user, to ask him/her to \"unspam\" the email. What do web masters usually do in this situation? Is there anything to watch out for? **Update** After one week, another user marked password emails as spam again. This led me into thinking someone is attacking our email reputation."} {"_id": "4495", "title": "Does changing web hosting servers affect SEO page ranking?", "text": "I have been hosting on a server for the last 2 years. I am planning to change hosting servers. Before I change servers, I would like to know whether it will affect my page ranking or not."} {"_id": "67508", "title": "Transferred hosting and website not online during transition for 2 weeks - will this affect search results?", "text": "If a website's hosting is being transferred and is not online in between for approx. 14 days - would this affect the organic search positions when it's finally transferred? When they've since disappeared entirely from the search results? 
NOT A DUPLICATE."} {"_id": "45568", "title": "Will changing server location affect SEO?", "text": "I have a .com website hosted in the US. Most of my traffic comes from there, but I found better prices and service in a webhost with a server in another country. Will it matter if I make the switch, given that I have an international audience but traffic mostly coming from the US?"} {"_id": "38603", "title": "Why do 410 pages show as errors in Google Webmaster Tools?", "text": "To remove links from our site, we return a 410 code on the links we want removed, and show `The page you requested was removed.`. In Webmaster Tools, I see all the 410 pages in Crawl Errors / Not Found. I'm worried that because they appear in Crawl Errors that they could be negatively affecting SEO rankings. Is that the case, and if so, should I change the return codes from 410 to something else?"} {"_id": "49405", "title": "How do I set up a subdomain like mail.v2techlabs.com where I can directly access my email?", "text": "I am using iPage shared hosting. I want to access my emails at URLs like `mail.xyz.com` where I can log in to my email directly. Right now I go to the iPage website and log in from there. Can you please let me know how to set this up?"} {"_id": "38605", "title": "How to restore plesk files correctly?", "text": "Recently my server crashed. So I reinstalled Plesk and uploaded all databases to `var/lib/mysql` and site files to `backup/vhosts`. How do I get it all back to normal?"} {"_id": "38607", "title": "Forwarding non-www domain to other domain using DNS", "text": "Is it possible to forward `firstdomain.com` to `www.seconddomain.com` or `seconddomain.com` using purely DNS records? I know how to forward `www.firstdomain.com` to `seconddomain.com` (using a CNAME). What I am trying to do is move my site from one domain to another (new) one, and not break all the links that use the old domain name. 
I can't do a 301 redirect as it's hosted on GitHub Pages and I don't have access to the web server."} {"_id": "67100", "title": "Is there an alternative to using an iframe so that my content would be visible to search engines?", "text": "I have a book-length document that is typeset with the LaTeX typesetting system (It's a markup language popular for science and math typesetting.) A custom script runs a program called HyperLatex that transforms the source for the book into both a PDF e-book, and a set of linked HTML pages. This generated site formats only the content, so for each chapter, I have a hand-written HTML shell page that holds all my branding boilerplate, navigation code, Google ads, etc... The script simply injects the HTML content for each chapter inline into the corresponding shell page. Voila, book and website are automatically in perfect sync with one command. The problem is this: AFAIK I've never had a web search hit! Apparently the iframe makes the content invisible to search engines. I'm not a Web programmer (obviously!) Is there some other kind of HTML container that would let me do this, but not render all of my content invisible to the search engines?"} {"_id": "60639", "title": "How do 2 domains on dedicated IPs sharing the same backend affect ranking?", "text": "I have two domains, both with dedicated IPs, for a location-based service. The two domain names have a simple-to-remember name pattern, like `##servicedescription##france.com` and `##servicedescription##germany.com`, with the main keyword in the first position of the domain names. Now, I could easily copy the CMS and service software onto each hosting plan, but I want the system to stay maintainable. Thus, I want both domains to share the same backend with slightly different content. Both websites are displayed in English as the default language. 
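For two country-targeted domains serving near-identical content from one backend, the standard way to declare the relationship to search engines is hreflang annotations; a sketch with illustrative URLs modelled on the naming pattern above (both sites are English-language, targeted at different regions):

```html
<!-- On each page of the France-targeted domain (URLs are illustrative) -->
<link rel="canonical" href="http://servicedescriptionfrance.com/some-page">
<link rel="alternate" hreflang="en-fr" href="http://servicedescriptionfrance.com/some-page">
<link rel="alternate" hreflang="en-de" href="http://servicedescriptiongermany.com/some-page">
```

The mirror-image annotations would go on the Germany-targeted domain, so each page points at its regional alternates.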
Does this in any way affect the ranking, and how can I optimize this setup?"} {"_id": "57377", "title": "Fetch as google - getting both 301 and 404", "text": "I am getting this from Fetch as Google: 
HTTP/1.1 301 Moved
Server: nginx
Date: Wed, 22 Jan 2014 23:07:58 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 20
Connection: keep-alive
Status: 404 Not Found
Location: http://www.peciatky.sk/page-not-found
Vary: User-Agent,Accept-Encoding
Content-Encoding: gzip
Set-Cookie: 8812c36aa5ae336c2a77bf63211d899a=Bbvc3w0i7riSZmjnod4ylyvTAnxn7DCLHYhXe92zKrRSDQqNDueWOsutRqv%2FallRQe4HFrGbnpiOTy4Cy%2Beglw%3D%3D000060; expires=Tue, 11-Feb-2014 23:07:58 GMT; path=/; domain=peciatky.sk; httponly
Set-Cookie: 8812c36aa5ae336c2a77bf63211d899a=Bbvc3w0i7riSZmjnod4ylyvTAnxn7DCLHYhXe92zKrSIJvpKfQ%2BsH0cil9dtFLSB7mx6NcgFC89BWp3h8IqeSQhy%2Btt5JlIMFe0f%2BdCAhpI%3D000075; expires=Tue, 11-Feb-2014 23:07:58 GMT; path=/; domain=peciatky.sk; httponly
Which one is the main error that Google works with - the 301 or 404?"} {"_id": "33574", "title": "When entered into Google, what's the difference between \"link:www.mysite.com\" and \"links:www.mysite.com\"?", "text": "I've heard this is the way to measure inbound links to a website? What is the difference between the two and is there a better way besides these?"} {"_id": "57800", "title": "Tracking many subsites in Google Analytics", "text": "We have a website where visitors buy access to online courses. We'd like to have reports about visitors across the website/all online courses as well as reports of visitors on specific courses. The main website and the online courses all reside on the same domain with the courses being hosted in subfolders. The number of online courses is already around 50 and will grow, so having a dedicated view per online course is a bad idea because of the limit on the number of views per account. 
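One way to tag every hit with its course on a single property is a classic ga.js page-level custom variable; a hedged sketch (the slot number, variable name, and course value are illustrative, not from the question):

```javascript
// Classic ga.js command queue; a harmless array until ga.js itself loads.
var _gaq = _gaq || [];

// Page-level custom variable: slot 1, scope 3 = page.
// 'Course' / 'course-42' are hypothetical names for this sketch.
_gaq.push(['_setCustomVar', 1, 'Course', 'course-42', 3]);
_gaq.push(['_trackPageview']);
```

Reports can then be segmented on the custom variable value instead of needing one view per course.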
Would using page level custom variables identifying each page view to a course be enough to produce interesting reports? Is there a better strategy?"} {"_id": "56881", "title": "Google stating price is missing from product rich snippet, but it is present", "text": "I'm trying to figure out how to start adding rich snippets to products but I keep getting an error from Google stating: > In order to generate a preview with rich snippets, either price or review or > availability needs to be present. But I have price in there in a meta tag. I don't get what's wrong here:
[markup not preserved]"} {"_id": "13092", "title": "Should Marketing departments have basic HTML skills?", "text": "Working within an organisation as part of the in-house site development team, a lot of my team's throughput is driven by the colouring-in (marketing) department. It is their responsibility to provide approved content and imagery for the features or enhancements that we include on each iteration of the company site. One thing I've noticed in this job and several previous ones is that the Marketing department is extremely particular about wording and presentation, but has little to no understanding of the actual medium with which they're working - the web. I find that my team is constantly making best guesses for various HTML attributes like image alt text, titles, rel tags, blockquote cite attributes and the like. How reasonable is it to expect that marketing departments have a strong understanding of the purpose of HTML metadata? Should it be the developer's job to remind and inform each time or are marketing departments falling behind the technology they're working with? What could I reasonably expect our marketing department to understand and provide every time with each new work request?"} {"_id": "23404", "title": "Having google index canonicals but users using parameters - correct?", "text": "I'm working on a site that has a search facility with multiple parameters that look up property listings. The possible parameters are: City, Area, Building Type, Min. Bedrooms, Max Rental Price, Page Number, Sort Order. The 'raw' URL, without any rewriting, would look something like this: **www.mysite.com/city=1&area=1&type=1&bedrooms=3&price=1000&page=3&sort=1** While you're using my site, it doesn't matter to me or to you what the URL looks like, so I think I'm happy to work with the so-called 'dirty' URL. 
It matters, however, what Googlebot sees, so I'm planning to add a URL rewrite to allow access to pages like: **www.mysite.com/london/kensington/apartments** And then I'm planning to add canonicals to make sure that's the page that gets indexed - no matter what your bedroom / price preferences are, what page of results you're on or the order in which you want them to appear. The idea is that Google will only index fewer, higher quality 'view-all' pages, but users will be able to drill down and refine their results to get very specific. The question, however, is whether or not this is a correct use of the canonical and whether it will lead to the desired effect? **EDIT** It doesn't matter if Google indexes 'dirty' URLs with parameters (though it should index the clean one when there's one available). What really matters is that the site gets found when people conduct a relevant search. Having it above competitor sites is the idea, if they didn't have an SEO strategy."} {"_id": "67106", "title": "Responsive link units in AdSense", "text": "As of now, it appears that the only way to implement Google AdSense ad units for a website built on responsive design is to use the Google responsive ad units. However, it appears that there is no responsive option available for link units. Is there any way to implement Google AdSense link units on a responsive design without disrupting the design and without having content lie outside the viewport? I asked the same question at Google Product forums and they couldn't help either."} {"_id": "36118", "title": "How to integrate formstack, wordpress and authorize.net?", "text": "What are all the steps involved in integrating Formstack with the WordPress CMS? 
What are all the steps involved in integrating Formstack with Authorize.net?"} {"_id": "44759", "title": "Do too many redirects hurt the SEO structure of a Wordpress blog?", "text": "I have a website http://www.voiceable.org/ and just moved a category's posts to the subdomain http://tech-fun.voiceable.org/ Now, I am using the Redirection plugin for 301 redirects from that category to the subdomain. The category has more than 500 posts; is it safe to use this number of 301 redirects from an SEO point of view, or will it have any impact on search engine ranking?"} {"_id": "44207", "title": "Bootstrap dynamic content", "text": "I am building a website powered by Twitter's Bootstrap. I have created a landing page with menus and text. The next thing I want to do is create an About page, but what is the easiest way to create a new page with the same layout? I don't think copying and pasting the layout would be a great idea. I have used PHP includes and $_GET variables for this in the past but I wondered if there is a more elegant solution. I hope this is the place for the question and I hope somebody can help me."} {"_id": "24038", "title": "Adaptive websites - considered keyword stuffing by Googlebot?", "text": "> **Possible Duplicate:** > Adaptive websites - considered keyword stuffing by Googlebot? I work on an SEO-dependent content website, and we are in the process of making our site adaptive/responsive (what are adaptive websites?). More technically, we are using CSS media queries to apply different CSS styles as the size of the browser changes. As the browser size shrinks to 320x480 and below (mobile), we use CSS to hide much of the content on the page - up to 75% of it. My question is - would Googlebot consider this practice keyword stuffing? 
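The pattern described (hiding large blocks of content below 480px with media queries) boils down to something like this; the class names are hypothetical:

```css
/* Hypothetical selectors illustrating the question's approach:
   the content stays in the HTML and is only visually hidden on small screens. */
@media (max-width: 480px) {
  .sidebar,
  .related-articles,
  .long-form-copy {
    display: none;
  }
}
```

The SEO concern is precisely that the markup (and its keywords) is still served to crawlers while being invisible to mobile visitors.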
My worry is that Googlebot may think we are stuffing the page with keywords to make the page relevant, but then hiding the keywords with CSS."} {"_id": "67104", "title": "Would my SEO be impacted if I migrated posts from Tumblr to a hosted WordPress solution?", "text": "I'm curious about both the positive and negative benefits of migrating old posts that were written on a Tumblr blog with a custom domain to a self-hosted WordPress solution. I've seen many people mention better SEO from a self-hosted WordPress solution, as well as the convenience of having all blog content in one place. If I were to migrate everything to the self-hosted WordPress, I would create `301` redirects from the old post URLs to the new post URLs in order to preserve SEO. However, are there downsides to this? Am I going to experience an SEO hit? URL example from my old Tumblr blog: `http://blog.example.com` URL example from my new self-hosted WordPress solution: `http://www.websiteurl.com/blog`"} {"_id": "44200", "title": "Web hosted payment form service recommendation, for card processing", "text": "I'm accepting credit cards on my website, using PayPal's \"hosted\" solution (an IFRAME that collects the credit card information, PayPal calls this \"PayFlow\"). I chose PayPal because they were a brand I recognized. However, PayPal has been absolutely dreadful to deal with, accidentally deleting my account overnight and whole comedy of errors, bad service, and hidden fees; so I'm looking for a replacement. I will eventually be taking recurring payments. These are for simple website service subscriptions, so it will be a custom \"shopping cart\". I don't want to touch the credit card numbers, to limit my risk as much as possible. Can anyone else recommend a major player that they trust to do this, please? I don't see where Google Checkout offers a hosted form, but I could be wrong. And if this is a bad place for this question, would you please give me a bit of advice on where to take it? 
There is supposed to be a FAQ at http://stackoverflow.com/questions/982953/what-is-the-best-credit-card-processing-service, but it is gone. edit: Based on one of the answers, I should mention this is for North American (almost exclusively US) business."} {"_id": "55537", "title": "Tracking content from 3rd party widget using Google Analytics", "text": "I have a situation where I generate content (a widget which is composed of simple divs specifically) and serve them to 3rd party sites both synchronously & asynchronously. While serving those widgets, I want to track the number of presentations and clicks from users using the widgets with Google Analytics. Is that possible? What is the best approach to face this tracking issue?"} {"_id": "68791", "title": "Issues overriding page field value in pageview tracking in Google Analytics", "text": "I decided to override the page value which is sent to GA on each pageview. My reasons for the override were: * Page titles are not always unique (some pages share one title) * Page URLs are ugly and meaningless; looking at a URL, it is hard to tell which page it is. So I explicitly provide the page value, like this: ga('send', 'pageview', {'page': '/store/books/viewlist'}) My questions are: 1. Is it good practice to override the page value? 2. Why, in Google Analytics _Behavior -> Events -> Pages_, does the page column show the page URL (i.e. \"/store/driver?page=777&action=viewlist\") and not the page value that was sent? Is this a GA bug? 
In other places, for instance in _Behavior -> Site content -> All pages_, the page column shows the correct page value (i.e. \"/store/books/viewlist\")."} {"_id": "68795", "title": "Is it possible to track URL parameter and then value of cookie?", "text": "I have recently launched a site for a customer which we would like to track initial visits on URLs like so via Google Analytics, and then track their revisits on their respective referral code: User comes in on: * `domain.com/?ref=code1` * `domain.com/?ref=code2` * `domain.com/?ref=code3` Once they land, we drop a cookie with the ref code and this determines what they see on revisiting the site. They are locked to this ref code unless they go through another affiliate website and come in on a new referral code. We would like to know which code they came in on, and then know which code is within the respective cookie? That data can then be drilled down into via filters or separate profiles perhaps? I am not an SEO specialist, just a web developer trying to sort this out."} {"_id": "68799", "title": "I'm getting the IP of my VPS instead of the domain in Google search results", "text": "I'm getting the IP of my VPS instead of the domain in web results. I don't have the same problem with other sites on the VPS. 
ServerName www.todocamino.info
ServerAlias todocamino.info
DocumentRoot /home/tirengarfio/workspace/todocamino/web
DirectoryIndex app.php
# Options Indexes FollowSymLinks MultiViews
AllowOverride All
#Order allow,deny
#allow from all
Require all granted
FallbackResource /index.php
# BEGIN EXPIRES
ExpiresActive On
ExpiresDefault \"access plus 10 days\"
ExpiresByType text/css \"access plus 1 week\"
ExpiresByType text/plain \"access plus 1 month\"
ExpiresByType image/gif \"access plus 1 month\"
ExpiresByType image/png \"access plus 1 month\"
ExpiresByType image/jpeg \"access plus 1 month\"
ExpiresByType application/x-javascript \"access plus 1 month\"
ExpiresByType application/javascript \"access plus 1 week\"
ExpiresByType application/x-icon \"access plus 1 year\"
# END EXPIRES "} {"_id": "33541", "title": "Canonical URL for paged results pages", "text": "If you use a canonical URL meta tag on a paged result page, will Google still link to it when a keyword is found? I have a forum which has multiple pages for each topic. 10 posts per page. When I google for a topic title, I see that page but also a link to the last page, and since the button said \"Last\", Google actually adds that to the link, making it look really weird because they remove the original title, which is also the topic title. Anyway, should I put canonical URL meta tags on pages that have \"/page/4/\" behind the URL? And if people search for content that can be found on page 4, will Google still point them to the right page? To make it more abstract: Having multiple links in search engine results is silly, but they should still be crawled as the content is not duplicate."} {"_id": "67361", "title": "Why are my posts omitted from Google search results when my site is listed in it?", "text": "My domain is not new at all, in fact I have waited for quite a long time now for Google to show some love. 
If I do a search for my domain name, it shows the website is listed, but if I try to search for even a year-old post on Google, using my website name too, it never returns it. Instead it says some results have been omitted and asks if those omitted results should be included in the search results. The following is the message it returns: > In order to show you the most relevant results, we have omitted some entries > very similar to the 8 already displayed. If you like, you can repeat the > search with the omitted results included. Why are my posts never returned in search results even though I have original and ample content on my website?"} {"_id": "43573", "title": "Is this considered keyword stuffing?", "text": "I have developed a page to define world time zone clocks online. At the bottom of the page, I list all 600+ time zones. Is this considered keyword stuffing? Does this impact ranking negatively? Should I remove them? Thanks!"} {"_id": "59782", "title": "Trying to add share button for blog articles", "text": "I have added share buttons to the articles on this page: http://185.37.226.104/noticias I have used the code generated here. As you can see there are multiple articles on the same page; I have done it like that because the articles won't be too long. So I have to use this structure. The problem: when I press \"Like\" on one article, nothing happens. Any help?"} {"_id": "68826", "title": "Tracking event completion based on page referral", "text": "I have a question about trying to get some detailed information around specific event completion. I'm wondering if it is possible within Google Analytics to track event completion within a specific Category but with different Actions? I have a group of pages (`A` which can be matched with a simple regex) and these pages have linked content (with an event on it) that can funnel users to another page (`B`) where I have an action I want to track (form completion essentially). 
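The funnel step described (an event fired from the linked content on the `A` pages) might look like this with analytics.js; the category and action names are hypothetical:

```javascript
// Standard async stub from the GA snippet: queues calls until analytics.js loads.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

// Fired when a user clicks the funnel link on one of the "A" pages
// ('funnel-to-B' / 'click-through' are illustrative names).
ga('send', 'event', 'funnel-to-B', 'click-through');
```

The form completion on page `B` could then be a destination goal, with reports segmented by visitors who triggered this event.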
What is the best way to set up a goal or flow to see how many people from pages `A` complete the action on page `B`?"} {"_id": "27828", "title": "Google analytics reporting link as campaign instead of referral", "text": "There are several websites which link to my blog posts but are using links from FeedBurner (similar to http://feedproxy.google.com/~r/AboutMyCode/~3/something ), so when someone visits these links it gets reported in Google Analytics as a campaign. Is there any way to make these links show as Referral in traffic sources?"} {"_id": "48079", "title": "Removing Google Webmaster URL Parameters", "text": "Some time ago I created a post about removing a full site from the Google index. After following much advice I have removed a large number of pages from the index, but now I face another problem: removing the Google Webmaster Tools URL parameters. My website's total indexed pages have come down from about 5,000,000 to about 97,000, but the URL parameters still show the following: 
parameter     URLs        effect    crawl
passed        1,274,056   Narrows   No Url
searchterm    1,269,622   Narrows   No Url
court         1,265,840   Narrows   No Url
sel           1,265,502   Narrows   No Url
page          1,187,018   Narrows   No Url
But the URL parameters are not decreasing. If I re-include the website in the Google index, the indexed page count goes up quickly again, as the parameters are still in Google's system. Another thing I want to know: if the indexed pages are 97,000, then how can the parameters show 1,200,000? I don't understand how."} {"_id": "59786", "title": "How can I stop \"Googlebot can't access your site\" messages from a site that's moved to another developer", "text": "I had a client who moved their site a while back to another developer. When they moved it they took down the site and put up a coming soon page (which is currently what's there). Googlebot warned me that it couldn't reach the site so I logged into my Webmaster Tools and removed the site from it, but I continue to get messages saying it can't access the site. 
Is there some way I can communicate this scenario to Google? I really thought that removing it from my webmaster tools would have done the trick. I'd rather not put the Google email address in my blacklist."} {"_id": "20299", "title": "Javascript - Detect if user is coming from a Google Adwords ad", "text": "Is it possible to detect if my visitors are coming to my website via a Google AdWords ad? I would like to change the telephone number for those users only; that way I can measure the success of my AdWords campaign with the number of calls."} {"_id": "48074", "title": "Time Update in HomePage Meta Description", "text": "Every time I add a new post, my blog homepage also shows the update time in its meta description. Whenever I type site:thetopblogger.com into Google search, it shows an update time in the homepage meta description - 1 hour ago, 4 hours ago, 15 hours ago, 3 days ago, 5 days ago - which matches the last post added/updated. I checked many blogs but none show any time in the homepage meta. I do not want to show the time in the homepage meta description, as Google will think that I refresh/update my homepage meta frequently! Any idea how to solve this? Thank you."} {"_id": "20297", "title": "Where are the \u201cdollar index\u201d stats for my content in the new version (V5) of Google Analytics?", "text": "I found the \"$ index\" for each web page to be one of the most useful features in Google Analytics, but I can't use the new version (V5) that was rolled out a few months ago, because that statistic is now gone. Has it been removed? Why?"} {"_id": "20292", "title": "HTML Code Editor - conditional find & replace", "text": "The Dreamweaver find and replace tool was very handy. (I no longer use Dreamweaver.) You could perform conditional search and replace on tags and/or tag attributes, which was very effective for cleaning up (atrociously) bad Word HTML code, for instance. 
Do you know of an alternative to the HTML find & replace tool found in Dreamweaver?"} {"_id": "20293", "title": "International Search Engines", "text": "Forgive me for using this forum improperly if I am, but I love this site so much, I wanted to start here. I recently took a position that's going to have me working with international search engines, most notably Baidu in China. Does anyone have some tips or places to start looking on SEO techniques? Does their indexing/robots work along the same lines as Google/Bing/Yahoo! etc? Does anyone have experience with this? Again, I apologize if this is not the proper place, but there are a healthy amount of people in the same industry in this community so I wanted to try."} {"_id": "20291", "title": "How to improve my google adsense CTR?", "text": "My website over the past year has 6k ad impressions but the click through rate is 0.12%, which amounts to a mere 7 clicks. How can I increase the click through rate? (I'm looking for general strategies rather than specific instructions)"} {"_id": "29721", "title": "Debugging \"Flex error #1001\" in Google results", "text": "In the little summary extract which Google displays about pages from a site I administer, there's frequently a Flash error message: > ### Page Title > > www.example.com/.../Slug_of_the_page > Flex Error #1001: Digest mismatch with RSL > http://www.example.com/framework_3.2.0.3958.swf. Redeploy the matching RSL > or relink your application with the ... _I know people will want to tell me what the error message means and how to recompile, but there are plenty of instructions about that elsewhere, and that's not my question._ I see this happening in search summaries for pages on this site which I think don't _have_ any Flash objects in them - and certainly Firebug's Net tab doesn't show any. Google's cached version of the page doesn't include the error messages in the HTML or text-only versions. 
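As a very rough local approximation of a web-page-to-text pass (emphatically not Google's actual converter), markup can be stripped down to visible text and searched for the error string; a self-contained, regex-based JavaScript sketch:

```javascript
// Very rough page-to-text approximation:
// drop <script>/<style> blocks, then all tags, then collapse whitespace.
function pageText(html) {
  return html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, " ")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

var sample = "<html><head><style>p{color:red}</style></head>" +
             "<body><p>Hello</p><script>var x=1;</script><p>world</p></body></html>";
console.log(pageText(sample)); // -> "Hello world"
```

Running this over a locally fetched copy of a page, then searching the result for \"Flex Error\", would at least show whether the message survives a naive text extraction.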
Trying things at random and waiting for Google to re-index is clearly not a sane strategy. Does anyone know a tool which reproduces as closely as possible Google's web page to text converter, or which in some way would allow me to reproduce the error message locally so that I can debug and test that I've fixed it?"} {"_id": "796", "title": "When should \"domain hacks\" like del.icio.us be used for most sites?", "text": "While it's fine for sites aimed at more advanced users, I find a lot of non-technical people tend to just add www and .com to every domain, so for example, my dad would type that in as `www.del.icio.us.com`. Obviously, if you were designing a site for fishing tips, for example, you would use a more traditional domain, while for your fancy web-based photo editor or whatever, it'd be fine. But those are the extremes, so where do you draw the line over whether those sorts of domains are suitable? (Yes, I know del.icio.us changed to delicious.com)"} {"_id": "9386", "title": "Is the timeago date format appropriate for a website?", "text": "We're building a website for a startup and we encourage using the \"timeago\" format for displaying dates (i.e. less than a minute ago, about 5 minutes ago, about a month ago, etc.) but the client argues that it's not used in the US, that people are just not used to it. I can make a list of hundreds of sites using it, but of course, I'm a geek. So in addition to the main question, what are the pros and cons of the \"timeago\" date format?"} {"_id": "57925", "title": "Google Analytics: Exclude all non-event unique visitors", "text": "I have a dashboard on which I want two widgets: * **Widget one** : A _statistics_ widget with the count of _unique visitors_ that did **not** fire _any event actions_. * **Widget two** : A _statistics_ widget with the count of _unique visitors_ that did fire _specific event actions_. 
Widget two was a piece of cake: * Statistics: `Unique visitors` * Filter: Only show `Event action`, with regular expression: `(action1|action2|action3)` But I can't seem to get widget one to work. I tried the same setup as widget two but with regex `.+` and \"exclude\" instead of \"only show\". This would seem to me like it would exclude all unique visitors with any event action? Instead it results in 0 visitors. **Edit** : Oh, and it is important that the widget actually exclude **any** event action, not just the action1, action2 and action3 from the other widget."} {"_id": "9380", "title": "Is it possible to get information about competitor traffic?", "text": "Is there a possibility to get competitor website traffic info?"} {"_id": "23509", "title": "Confused by Fetch as Googlebot behaviour", "text": "I have a site that I'm trying to index and I am a bit confused by what I should expect to see in the result of the `Fetch as Googlebot` tool. The site uses hashbangs, so according to Google it _should_ be crawlable by substituting hashbangs with `?_escaped_fragment_`. However, the results of the fetch don't show the actual content. Is this normal?"} {"_id": "791", "title": "How many domains to split components across?", "text": "I understand how splitting components across domains can maximize parallel downloads, and enable you to have cookie-free static content domains, but since there's a time cost for each domain lookup what is the optimum number of domains to use?"} {"_id": "790", "title": "Far Future Expire Headers", "text": "Do you use these? If so, what strategies do you have for managing changes to the files? 
http://developer.yahoo.net/blog/archives/2007/05/high_performanc_2.html"} {"_id": "67147", "title": "Is there a use case for not employing Elastic Beanstalk?", "text": "Being fairly new to the AWS ecosystem, is there any reason I wouldn't want to use Elastic Beanstalk for my python+django+yeoman web app (still being prototyped, but want to be ready to scale quickly as soon as it's released)?"} {"_id": "67145", "title": "Crawl errors and duplicate URL", "text": "I am getting lots of crawl errors; Google is adding the main domain to the URL. In sitemap.xml: http://www.somedomain.com/whateveryouwant.php (changefreq: monthly) From Google Webmaster Tools: Not found: http://www.somedomain.com/whateveryouwant.php When clicking it shows: http://www.somedomain.com/www.somedomain.com/whateveryouwant.php As you can see the domain URL is duplicated. On every page on this site I am using a header and a footer, I have navigation links on these, and they are \"included\" on each page: The links on the header are absolute: Whatever Is this affecting crawling? It is happening on almost every page, any ideas?"} {"_id": "67144", "title": "Can we import and display Facebook comments on our social media site?", "text": "I'm implementing a startup social media site where users post product reviews and other opinions. One of the features is a Facebook Share function. The business owners are interested in using the Facebook Graph API to track posts which users share on Facebook. They would like to \"monitor\" comments and other activity that occurs on Facebook for our posts that our users have shared. Technically, my understanding is that this is possible. Are we allowed to import these comments for display on our site? We are looking for options that would let us integrate or \"redisplay\" comment activity that occurred on Facebook but about a post on our site. Would this be problematic from a legal standpoint? 
We would rather stay away from official Facebook \"comments plugins\" and other such tools - and retain full control over the way that such comments are displayed on our site."} {"_id": "9389", "title": "What is the difference between a Facebook Group and a Facebook Page", "text": "I created a Project Management Stack Exchange Facebook Page to help promote the new Project Management Stack Exchange Site. One of the users suggested a Facebook Group instead. I use Facebook, and I've searched the help pages, but it's not immediately clear to me what the advantages and disadvantages are. Can you provide a list of uses for Groups and uses for Facebook Pages and perhaps some links to resources that help differentiate one from the other?"} {"_id": "57928", "title": "Spam being sent from my domain", "text": "I have a suspicion that somebody is sending spam from my domain. Today I received an email from postmaster(`AT`)halliburton.com to htuwxjw(`AT`)example.com which said: > The following email was blocked because it contained an attached file type > that is prohibited by the Halliburton email gateway: > > Sender: htuwxjw(`AT`)example.com > Subject: ATTN: Important notification for a Visa / MasterCard holder! htuwxjw(`AT`)example.com does not exist. I have a catch-all email which receives any email sent to @example.com. Is there anything I can do to prevent this? I have this TXT record on example.com: (MX records are Google's) v=spf1 a mx ?all I'm worried that any emails I send will be blacklisted if this person keeps sending emails as if they're from my domain."} {"_id": "798", "title": "Do I have to pay for advertising to get my site known?", "text": "If your site isn't going to show up in the search engines until people start linking to you, and no one is going to link to you until they know your site exists ... which will be when the search engines start showing you... Is the only way to get your site known to pay someone to advertise your site? 
What are some other ways to get your site started up?"} {"_id": "65510", "title": "Problems with setting up email notifications in Redmine", "text": "I've installed a fresh Redmine 2.5.1 installation on a Ubuntu 12.04 server. Redmine is served by Apache (Passenger). The server can send emails by use of Postfix (tested with the _mail_ command from the terminal). After reading the help document on Redmine's Email-Configuration, I'm confused where to put what, though. My aim is to use the server's ability to send emails with Postfix and not to use an external SMTP like Gmail. Redmine is also giving me the following error in email settings: ![enter image description here](http://i.stack.imgur.com/P3lSi.png) In `/etc/redmine/default` I have the following files (no _configuration.yml_!): * database.yml * session.yml In `/usr/share/redmine` resides: * additional_environment.rb.example * environment.rb * locales * boot.rb * environments * routes.rb * configuration.yml.example * initializers * settings.yml Questions: 1. Which directory is responsible for Redmine's config? 2. Which file is responsible for email settings? 3. What configuration is needed to use the server's Postfix?"} {"_id": "42757", "title": "How many websites use a CMS?", "text": "Is there any hard data about the percentage of sites that use some kind of a CMS, as opposed to pure hand-coded HTML/CSS? From my anecdotal experience it really seems that more and more sites use a CMS nowadays: hand-coded sites take too long to set up, require too much time to maintain, are too hard to update regularly, etc. Pure HTML sites seem insufficient for any but the most basic of websites."} {"_id": "47012", "title": "How can I block a user agent from all sites on my server?", "text": "For the last few days, I've been suffering from what appears to be a (presumably inadvertent) DDOS attack. 
I've been getting so many requests from an agent identifying as \"Mozilla/4.0 (compatible; ICS)\" that apache eats through all the available memory. Consequently, I'd like to block all requests accompanied by this user agent, so I tried doing this in httpd.conf: SetEnvIfNoCase User-Agent \"Mozilla/4.0 (compatible; ICS)\" bad_user Deny from env=bad_user But when I restart apache it complains about using `deny` here. Without having to wrap it in a `location` or `directory` block, which would mean I'd have to add a new block for each site, is there any way I can deny access to the whole server? * * * UPDATE: The error I get > * Restarting web server apache2 > Syntax error on line 4 of /etc/apache2/httpd.conf: deny not allowed here > [fail] >"} {"_id": "68869", "title": "Redirect Everything After slash (/) to another directory", "text": "I'm sure this is an easy thing to do but I can't seem to find the answer! I'm trying to redirect `https://carddav.example.com/MYUSERNAME` to `https://carddav.example.com/remote.php/subdomain/addressbooks/MYUSERNAME/contacts` where MYUSERNAME can be anything. 
The current RewriteRule I have attempted looks like: RewriteRule ^/(.*)$ https://carddav.example.com/remote.php/carddav/addressbooks/$1/contacts/ [R=301]"} {"_id": "58402", "title": "How to add a remote file to .htaccess file for webcaching?", "text": "I have the following code in my _.htaccess_ file that caches several file extension types: ## EXPIRES CACHING ## ExpiresActive On ExpiresByType image/jpg \"access 1 year\" ExpiresByType image/jpeg \"access 1 year\" ExpiresByType image/gif \"access 1 year\" ExpiresByType image/png \"access 1 year\" ExpiresByType text/css \"access 1 month\" ExpiresByType text/html \"access 1 month\" ExpiresByType application/pdf \"access 1 month\" ExpiresByType text/x-javascript \"access 1 month\" ExpiresByType application/x-shockwave-flash \"access 1 month\" ExpiresByType image/x-icon \"access 1 year\" ExpiresDefault \"access 1 month\" ## EXPIRES CACHING ## I would like to add a remote file to the list, `www.example.com/js/jqueryvalid.js`... Is this possible?"} {"_id": "26073", "title": "Can a mobile version of site be considered duplication by Google if the page is the same only the css is different?", "text": "Hearing about the problems with mobile browser detection, I plan to add a button to my site which enables users to switch manually between the normal and mobile versions of the site. The button sets a cookie and reloads the page, and the PHP code on the server side, seeing the cookie, serves exactly the same page with only the CSS link pointing to the mobile stylesheet file. I heard Google is able to follow javascript code too, so it may \"push\" the button and see the mobile version of the site. Wouldn't it confuse Google that it sees exactly the same page (same URL, same content) with only the CSS link replaced? 
Wouldn't it consider it duplicate content?"} {"_id": "28041", "title": "Should I index my mobile duplicate of my desktop website on google?", "text": "> **Possible Duplicate:** > Can a mobile version of site be considered duplication by Google if the > page is the same only the css is different? I have a duplicate of http://he.thenamestork.com with the url http://he.thenamestork.com/mobile - all files are duplicated while the mobile version has slightly different content. Notice that when I write 'mobile' I only mean regular HTML4 with smartphone-friendly CSS. I have a series of redirects (using .htaccess) that allows smartphone users to land directly on the mobile versions. But I wonder, should I index the mobile version as well, so those users will be able to get direct, faster links? And what is the proper way of doing that without causing problems in Google search? I guess I'm asking if there's a way to get Google to display regular URLs for desktop users and ../mobile/.. URLs for smartphone users, and if it is smart SEO-wise."} {"_id": "65290", "title": "Canonical url homepage", "text": "I have a mini-website that has one unique webpage that talks about a product. The current existing links for this website are: www.example.com www.example.com/keywords-important-for-webpage Both pages show the same content, only the second page has a better URL. I want Google to index and show my link with keywords in the search results instead of the homepage. So I added a canonical ref. on the homepage that points to the other page with the better URL structure. But now, Google has indexed the homepage but not the URL with the keywords. So exactly the opposite of what I wanted. Is it not possible to add a canonical URL to the homepage? If it is, what could be the problem?"} {"_id": "68861", "title": "How to track pixel density with Google Tag Manager", "text": "I originally had it set up like How to see pixel density (Retina) in Google Analytics? 
with Universal Analytics by setting custom dimensions. However, we recently had to switch to Tag Manager. Currently I'm a bit lost on how to replicate the same actions as before. Can anyone provide insight or a tutorial as to how to set this up in GTM?"} {"_id": "68829", "title": "How do I implement \"share on Twitter\" for photos on my website?", "text": "I have a \"share on Twitter\" button on my website that allows users to easily share a tweet with the web page. The page features a photo, and we would really like to have users tweet the photo itself out. Is it possible to build \"tweet this photo\" functionality from my website? If I fill the image URL into the tweet, Twitter shows a link to the image, but not the image itself. Users can upload photos to Twitter directly, and those get shown fully in Twitter."} {"_id": "68863", "title": "DNS mistake. Google crawled the wrong server. What to do?", "text": "# Problem Exists Between Keyboard And Chair I have a couple of similar-looking domains on Cloudflare where I manage their DNS: an old forum and my new, still-MVP site. As many of you may have experienced, Cloudflare moves around the order of the domains on the websites list based on use, with the last one you edited always on top. Based on use my MVP site has been on top for a few months now, but recently I've been making some changes to the forums as well. Well, what happened is that at one point I inadvertently relied on my spatial memory to click on the edit button, and changed the IP of the wrong site! # The aftermath I managed to change the A record for the MVP site that is currently live to point to the forum's server. I didn't realize the mistake until I received an email from Webmaster Tools telling me of all the errors on my new site, so Google ended up crawling the forum, which was now being served under my MVP's domain name. Google now thinks my new MVP domain name has a completely different structure. # What can be done? I have since fixed the DNS issue. 
But now I am wondering if I have permanently ruined my new site's Google ranking with this mistake. * What problems have I created for my new site as far as SEO and Google ranking? * Can anything be done to curb/fix them?"} {"_id": "1130", "title": "Best alternative to Adsense for a small website?", "text": "The accepted answer to this question states that Adsense functions rather poorly. What is the alternative then? My site has about 400 daily visitors, 1.5 pages/visit - so I'm too small to manually approach any sponsor for a personal ad deal. What AdSense alternative would work best? By 'best' I mostly mean \"make the most money\", although usability is also a factor."} {"_id": "68865", "title": "Any tool to change words' positions in a text?", "text": "I'm looking for a tool that can change the positions of words in a text. It's something like an article spinner, but without changing the words to synonyms. An example: Original text: I love eating apples and drinking orange juice, it's delicious. * * * Shuffled text: It's delicious, I love drinking orange juice and eating apples."} {"_id": "68864", "title": "List all domain names that were created on a specific date", "text": "Is it possible to list all domain names that were created on a specific date? I have the \"create date\" field and the registrar. Edit: Why was my question migrated to \"Webmasters\"? This is totally unrelated to \"Webmasters\"..."} {"_id": "23502", "title": "webhosting with IP access", "text": "I'm using a shared webhosting service but I can't access my website using http://ip.ad.dre.sse of the website. I have activated a dedicated IP service but the problem has not been solved. What do you suggest to solve this problem? I need to have IP-based access because I'm using a GPS that can be configured only by the server IP (not URL). Any advice will be very helpful. ** edit ** Thanks a lot for your answers, my hosting provider has solved the problem of IP-based access. 
Regarding the dedicated IP service - it is not free, it costs 5 euros, but it can be activated with a free account."} {"_id": "68830", "title": "How will search engines find out if the content of dynamic sites is changing?", "text": "I have a site with a few hundred blog posts and a few thousand dynamic pages. If I create a sitemap.xml for the dynamic pages, Google does not crawl and index it. The reason: they are not linked to each other or from other pages. The data on each dynamic page gets updated every few weeks. If I create links from the home page and then link these pages from inner pages, search engines can crawl them. How will the search engine figure out that the content has changed after a few weeks in the absence of a sitemap.xml for these dynamic pages? I use a sitemap.xml for blog content but not for the dynamic pages."} {"_id": "34418", "title": "Geographic location settings", "text": "I am building a website. It has an .nl domain. Right now my domain is only showing up on google.nl. I hope I can change this somehow so that it can be found in all Google sites (like google.com / .co.uk) and so on. If I look on Google forums, they say to go to Webmaster Tools and change your geographic target there. But I have added this site and I am not able to change it there because there is no select box. I don't have any idea where to search (yes, I searched on Google of course) or where to ask about this particular problem. So maybe someone here can redirect me or explain what is possible and what is not. The question is: can I make an .nl domain findable in (almost) all Google search sites? And if so, how can I do that? Picture of my Google Webmaster Tools (nl): http://i.stack.imgur.com/ZuP4L.png"} {"_id": "42976", "title": "Is there other Mailloop-like software?", "text": "I used to use Mailloop to do mass mailing and I was happy with it. Now it's no longer supported. What other program can we use to send mass mail? I also need an autoresponder, including a series autoresponder (people sign up and get 5 emails, one per day). 
Anything else like that? Basically I need software that: 1. Captures the user's email and first name 2. Sends a mail merge to those users 3. Supports an autoresponder Should I just use the MailChimp service?"} {"_id": "43725", "title": "One domain and multiple websites in folders", "text": "I am going to create a network with one domain, e.g. `example.com`, then manage my websites in **folders**. Look below for an example: www.example.com/market www.example.com/freebies www.example.com/personalblog www.example.com/shop Consider that all four websites have different designs and code. From an SEO perspective, is this recommended, or should I use **subdomains** or buy **four domains**, one for each website?"} {"_id": "21402", "title": "Stop directory listing using .htaccess and redirect to good error page", "text": "I have a directory with several subdirectories. I found this article, Preventing Directory Listing, that shows me how to prevent people from getting the directory listings: IndexIgnore * But it produces a bad error page (it looks like an empty directory listing). Instead I would like to redirect the user to a page or show a better error page when they try to view the directory listing of a subdirectory. I am using Apache."} {"_id": "15130", "title": "When I am root, \"mysql\" connects without a password, even though I've set one", "text": "When I am root, \"mysql\" connects without a password, even though I've set one: # mysqladmin -u root password 'whatever' # mysql -u root -p Enter password: (typing the 'whatever' above) Welcome to the MySQL monitor. Commands end with ; or \\g. Your MySQL connection id is 4 to server version: 4.1.22-standard Type 'help;' or '\\h' for help. Type '\\c' to clear the buffer. mysql> Bye but unfortunately this also happens... # mysql -u root Welcome to the MySQL monitor. Commands end with ; or \\g. Your MySQL connection id is 25 to server version: 4.1.22-standard Type 'help;' or '\\h' for help. Type '\\c' to clear the buffer. 
mysql> So even though I've set the password, and it IS checked when I use \"-p\", it is however not necessary!?!?"} {"_id": "34031", "title": "Will not supporting IE or older browsers drive away potential visitors/users of my site?", "text": "I'm normally a SO browser, but this question doesn't fit there; hopefully it fits here. I just want to ask, from a web designer's point of view, if it's wrong to not care about supporting Internet Explorer or older browsers. The site I'm designing looks great in all browsers except IE9-. There are certain things that IE doesn't support, or handles differently from other browsers: webkit stuff, some CSS styles, drag-and-drop files from the OS, etc., but it all works great in Safari, FireFox, Chrome etc. Should I be that concerned? I know there are several people that use IE, but its limitations have just been causing me more work by having to come up with workarounds. From what I've read, many of the issues I've been having should be solved with IE10, but not everybody keeps up to date. I know of several people who are still using IE6! Again, I'm hoping this is the right place to ask a question like this, and if not, please point me to the right Stack Exchange site instead of just downvoting me. Thanks! EDIT: Upon further research... So far this year, IE (all versions) and Chrome have been neck and neck at the top, with IE only squeaking by Chrome, and FireFox a close 3rd. But looking at the top 10 browsers, IE6 doesn't even show up on this list, in which the lowest percentage is 1.92%. Source: http://www.w3counter.com/globalstats.php?year=2012&month=7 Having a look at this other site, IE6 shows up in 11th place out of 12, just before \"Other\": http://www.sitepoint.com/browser-trends-february-2012/ This makes me a little more wary of not spending more time on IE compatibility. However, my site will not be going to a live beta until October or November, and I'm hoping that IE10 will have more features coded into it. 
Currently, I've written my upload page, which is a \"drag-and-drop files from the OS\" type, to simply display \"IE is not supported\", leaving no other option for IE users to upload pictures, because I've spent so much time writing the uploader, which does many things other than just upload the files. I will be changing this kinda cold \"Access Denied\" to a suggestion to upgrade, or install other browsers, with download links for each. Big thanks for the posts here and the interesting links!"} {"_id": "59563", "title": "Google analytics shows wrong number of page views, asp.net website", "text": "Sometimes it can be, for example, 4500 requests; after a few hours it shows a few thousand fewer. What is wrong? It looks like analytics corrects itself. I changed from classic to Universal a few months ago; I don't know if it has anything to do with this. In masterpage: "} {"_id": "26992", "title": "If I have only WOFF and EOT, what browsers am I supporting with @font-face?", "text": "We're looking at buying a font which only allows its use on the web in the provided formats: WOFF and EOT. I'm not sure what browsers those work in and can't seem to find up-to-date information. What browsers can I support with just those two? My only experience is with fontspring's syntax, which has EOT, WOFF, TTF and SVG."} {"_id": "31766", "title": "Worth changing the URL structure to incorporate keywords?", "text": "I am migrating my blog from PHP to ASP.NET and while recoding the whole website, I figured I might as well improve the URL structure. This is what a URL looks like now: **example.com/blog/post/755/hakurei-reimu-cosplay-from-touhou-by-kishigami-hana** and this is how it will look after the change (cosplay being the dynamic main keyword of the post): **example.com/blog/cosplay/hakurei-reimu-cosplay-from-touhou-by-kishigami-hana-755/** The website is a bit more than half a year old and receives around 650k page views a month, mainly from search traffic. 
Of course everything would be redirected with 301 redirects. Do you think it is worth changing to a new URL structure, or will it harm the ranking in the long run?"} {"_id": "26990", "title": "Legal information on a website", "text": "I'm in the process of re-developing my website. We're currently a non-monetary website, funded entirely by ourselves, but we'd like to start earning some money from adverts and possibly paid content. We are a UK-based fishkeeping website (although we've had visitors from every single country in the world if you believe our Google Analytics!) with tropical fish species profiles, a Question & Answers section (similar to SE) and a discussion forum. The species profiles are written by our own author, appropriately referenced to scientific journals etc (though it's all our own prose). We also use images from photographers with their express permission and as such copyright all of our images with the copyright symbol and their name or chosen brand. **What I'd like to know is this** : In terms of privacy and legal \"blurb\", what information should we be displaying to our users? Do we need to protect ourselves in any way? Users will be able to register an account then post comments, questions, answers and topics. What information do we need to display in terms of copyright? Thanks in advance,"} {"_id": "7148", "title": "SMS ad service for a PHP app", "text": "I am about to launch a website that allows the end user to get alerted as their bus approaches the stop. The only problem is that this is a little expensive, as I'm paying about 5 cents per alert. I was thinking that if I added a short message, something like \"brought to you by acme co\", at the end of the message, I may be able to recoup costs without making my service a paid one. Anyone know of any SMS ad companies, or how one would go about finding one? 
Google has failed me, probably because I'm searching for the wrong thing."} {"_id": "7149", "title": "Need common platform: Wiki, Article, Forum, Quiz, News & a global dashboard", "text": "I am looking for an application to install on my server with the following features: 1. A global dashboard 2. Wiki 3. Article 4. Forum 5. Quiz 6. Newsletter (Optional) 7. Q/A (Optional) I already found an application, TWiki, but I'm looking for a better alternative."} {"_id": "35786", "title": "Does Google pay each time the user navigates and new ads appear?", "text": "I have a mobile app with Google ads on each page (using the pay-per-view method). Does Google pay each time the user navigates and new ads appear? Thanks."} {"_id": "26994", "title": "Why is this GIF's animation speed different in Firefox vs. IE?", "text": "Oracle Enterprise Manager has a web interface that uses this GIF: ![Oracle Enterprise Manager](http://i.stack.imgur.com/gYwDk.gif) The odd thing about this GIF is that in Firefox (v9&10) it spins about twice as fast as in MSIE (v7&9). **Why does the animation speed change depending on the browser?**"} {"_id": "7144", "title": "Intentional placing of text in clipboard", "text": "By intentional I mean the user is actively clicking a button or performing an action that leads to certain text being placed in their clipboard. As far as I know the only solution to this is Zero Clipboard, but I feel like it's not accepted as a good solution since it uses Flash. Are there any alternatives if I want to allow users to copy specific text information at the click of a button?"} {"_id": "7147", "title": "How to rotate html5 canvas as page background?", "text": "I want to achieve the following: Imagine a white sheet of paper on a black desk. Then rotate the paper a little bit to the left (like, 25 degrees). Now you still have the black desk, and a rotated white box on it. In this rotated white box I want to place non-rotated normal html content like text, tables, div's etc. 
I already have a problem at the very first step: rotating a rectangle. This is my code so far: With this, I see only a normal black box. Nothing else. I assume there should be a red, rotated box too, but there's nothing. What is the best approach to reach this and to have it as a (scaling) background for my web page?"} {"_id": "39556", "title": "How to retrieve an image from MySQL database and display the image as background on body?", "text": "How do I retrieve an image from MySQL database (I know how to display image by retrieving from database but here i want to display differently) i.e. display the image as background on body and image should repeat as if css repeat statement is used - the very same as on twitter.com. I want to display an image on background (by using CSS repeat thing) the same way as on twitter but the problem is that I am retrieving it from a MySQL database and CSS cannot handle src tag on background. I am very new to this and have been trying to solve this problem for 6 hrs but still it is not sorted out so please help. My code is PHP and MySQL (please tell me how to get URL for this image from the database and if I have to use CSS repeat)?"} {"_id": "18747", "title": "How to get Google to crawl AJAX pages without using #! URLs?", "text": "In Google's advice on \"Making AJAX Applications Crawlable\", they advise making AJAX URLs that aren't using hash-bang fragments (#!) crawlable by adding `` to the page ``. Has anyone had success with this? I can't find Googlebot taking HTML snapshots of the pages when using 'Fetch as Googlebot' in Webmaster Tools."} {"_id": "63346", "title": "Which special characters, if used in tags, will be visible in Google SERPs?", "text": "Seems like Google doesn't display some special characters if they are used in `<title>`. 
What are the characters that will be visible in SERPs for site titles?"} {"_id": "37647", "title": "Web Development Environment: How to distribute edited hosts files over a bunch of Mac machines?", "text": "I am doing some research to prepare a web development environment for our small (10 people and growing) new office. Use case: for each new web project we usually create a new alias on an Apache server, someproject.companywebsite. From my understanding, in order for the rest of our team (including managers and directors) to see this website locally, they will need to edit their hosts file (e.g. \"192.168.1.10 someproject.companywebsite\"), and do that each time for a new project (can be 2-5 each week). Solution: I am looking for a way to edit this hosts file only once and distribute it over all Mac machines in our network at once, or at least much more smoothly than poking around with each machine every time, over and over again. Is that possible? Or is that a very wrong way of doing it? Perhaps we'd better set up our own local DNS server and point our router to it? Though our own DNS server concerns me a bit, because there might be network interruptions and other lags, if you know what I mean. Or perhaps there are other workflows for this? What's the best way to handle such things? I'd be grateful to hear some advice from experienced admins. I couldn't find this info on the internet, so if you know where to read about it, point me in the right direction. Thank you in advance, Alex"} {"_id": "18744", "title": "Can randomly-generated field names help avoid SPAM bots?", "text": "If I were to generate my field names randomly each time a contact form was rendered, would that help prevent SPAM bots from recognizing the fields? 
Can they read label text?"} {"_id": "18743", "title": "Is the idea of user registration flawed if we can't provide any real value for it?", "text": "I work in a newspaper publishing company and currently, all the content on our site is free and no user registration is required. Recently, the lifestyle column team proposed implementing a small and simple portal where viewers can make simple suggestions as to where to dine. The team then suggested implementing a simple user registration, asking for name, email and password; or, if people do not want to register for an account, they can log in with their Facebook account. This is so that they could keep track of who suggested what. However, I think this is counter-productive as we don't really provide any value in return. As the team said, they just want to keep it really simple. There will be no user control panel and they won't be doing anything with the user data such as sending promotional newsletters etc. As such, is it really feasible to implement a user registration in situations like these?"} {"_id": "37640", "title": "Transition to new site", "text": "I'm almost finished rewriting the website for a non-profit organization. The existing site receives ~5,000 visits a month. The new site is being written in ASP.Net and the existing site is PHP. The current hosting provider does not support .Net hosting, so I'll be switching providers. My question revolves around the transition from the old site to the new. I would really like to get the new site up at the new hosting provider and do thorough testing before changing the DNS records for the domain. **Question:** How can I put the new site up, test it, and make any changes/additions necessary before updating the domain DNS to point to the new IP, **without** Google indexing the content? 
Also, what SEO repercussions should I be aware of when making such a drastic change to the content that exists under the domain name?"} {"_id": "18741", "title": "Blogspot as a simple CMS", "text": "Blogger/Blogspot recently released a new version of their software. This new version appears to have features relevant to a simple CMS (static pages, albeit limited). I read on their Buzz Blog about a few websites that don't necessarily look like a typical Blogspot blog, but rather like a typical website deployed using minimal CMS software: http://buzz.blogger.com/2011/07/you-can-do-some-amazing-things-with.html Can anyone point me to resources where I can learn how to do this? (Preferably case studies with some steps on how to create such a website, as opposed to a Blogger HOWTO.) Bonus points if you can also tell me the infrastructure of Blogger.com (software stack, etc). Thanks"} {"_id": "55034", "title": "Tracking state of a one time event on a big website", "text": "Assume a website with 250 million active users. I add a new feature to the website. Once a user visits, I want to use a short tutorial to teach them how to use said feature. I only want them to complete the tutorial once (or actively click it away). What is the smart way to code the verification check for this? How do I track the progress in the database? Having a separate table with something like NewTutorial_completed = 1 for user_id = 21312315 would just snowball. It also feels intuitively bad to check for every one-time event for every user on every page view. While writing the question I got one idea: to have a separate event log that is checked periodically for any new action the user needs to see or perform. I push events to this log and once they are completed they are removed from the log. No need to store NewTutorial_completed = 1-type variables this way. I am sure this is a common problem. 
I would appreciate any input on what best practice is."} {"_id": "33250", "title": "Impossible referrers", "text": "I'm tracking webpage referrers. Sometimes I see Russian (or other-language) blogs as referrers, which cover other subjects. Googling \"site:http://myrussianblog.ru mywebsite\" returns nothing. I don't use advertising systems. What do these referrers mean? I know that the referrer field can be changed... but why?"} {"_id": "12877", "title": "Make all subdomains load the same page, how is it called/done?", "text": "Is there a way to make all subdomains show the same page? I got the idea somewhere that you could have a page like, say, information.com and if a user typed miami.information.com he would get information from that city. Is there a way to do it? Like making the site retrieve the first part of the subdomain and search for it on the site."} {"_id": "12876", "title": "Google Instant Preview (Image) - What screen dimension? e.g 1024x768?", "text": "I'm wondering what screen dimensions Google Instant Preview takes a snapshot of your website at, e.g. 1024px x 768px? http://sites.google.com/site/webmasterhelpforum/en/faq-instant-previews FYI: I'm building a \"responsive design\" layout. I wonder if others like Yahoo or Bing take screenshots too? And what dimensions do they use?"} {"_id": "8051", "title": "Server-infrastructure recommendations", "text": "Here's the thing: I need a cheap, fast, reliable infrastructure that can dynamically scale (like Amazon S3: cloud storage). I'm thinking of 3 different types of 'servers'. 1. Application-server * Should be able to run CentOS (or another light Linux distro) * Should be able to run Apache * Should be able to run PHP * Should be able to run GD (so it does rely on its CPU). * Should be _extremely_ reliable and fast. 2. Database-server * Should be able to run MySQL * Should be able to... well, do nothing else :P. * Should be _extremely_ reliable and fast. 3.
Storage-server * Should be able to run some kind of file-transfer daemon (like FTP, CouchDB, etc.) * Should be able to do nothing else. * Should be _extremely_ reliable and fast. So technically, by transferring all static data to 2 different servers/services, the application-server can totally focus on the webpages. My questions: * What services do you recommend? * Which is cheaper, faster and more reliable: using my own server, or using some cloud-storage/cloud-computing service (like Amazon S3, CloudFiles, etc.)? * How can I prevent bandwidth abuse (such as DoS attacks causing the bill to be extremely high)? * What's the difference between \"including CDN\" and \"excluding CDN\"? It seems the price doesn't differ at CloudFiles? * Do you have to pay \"including CDN\" + \"excluding CDN\" when you decide to enable the delivery network? Or do you only have to pay \"including CDN\"? * Should I use my own nameserver too or can I use my domain hoster's nameservers? What are the minimum software specifications of a nameserver? Can I write some software myself? Does anyone have a good protocol description? I hope you can answer my questions. **Answers** * I shouldn't write my own nameserver software. Instead, I should use something like BIND. (http://osspro.com/2010/05/04/linux-create-your-own-domain-name-server-dns/)."} {"_id": "8050", "title": "Google maps for a mobile site", "text": "We have a website that uses the Google Maps static maps functionality to embed maps into pages. This works with no problem; we just use something like this: We are now creating a mobile site for the same client. The code itself will still work, but I'm wondering how to make sure I get a map that doesn't take over the entire screen, but is still readable. Is there a standard size that's used for mobile screens?
I'm guessing with the static maps there's no way for a user to click to zoom in if we use a small size, but the regular API requires JavaScript and a lot of mobile browsers still don't support it. Any ideas?"} {"_id": "8505", "title": "How long can I expect to wait before a DNS name is transferred to a new server?", "text": "I need to know how long it should take to transfer my domain name from one server to another. I have a domain name and I am trying to transfer it, but one of the local companies said that I can't transfer it in less than 2 months with them."} {"_id": "34043", "title": "mysql_real_escape_string is giving me errors when I try to add security to my website", "text": "I tried doing this: @ $db = new myConnectDB(); $beerName = mysql_real_escape_string($beerName); $beerID = mysql_real_escape_string($beerID); $brewery = mysql_real_escape_string($brewery); $style = mysql_real_escape_string($style); $userID = mysql_real_escape_string($userID); $abv = mysql_real_escape_string($abv); $ibu = mysql_real_escape_string($ibu); $breweryID = mysql_real_escape_string($breweryID); $icon = mysql_real_escape_string($icon); I get this error: Warning: mysql_real_escape_string() [function.mysql-real-escape-string]: Access denied for user"} {"_id": "12871", "title": "Major SEO pitfalls of JavaScript-loaded CSS?", "text": "Looking to start with a basic.css and have JavaScript load enhanced.css when JavaScript is enabled. What major negatives will this have for search engines indexing and ranking my site?"} {"_id": "12870", "title": "GoDaddy domains, no bids, should I wait for expiry?", "text": "I'm interested in a GoDaddy domain name that's up for auction. The domain name has no bids on it. Should I wait for the auction to end and then buy it afterwards or should I bid in the auction?
Basically the question is, will the domain still be available to buy after the auction ends?"} {"_id": "34519", "title": "Incorrect Google Profile Results", "text": "Where does Google get the info they display along with your profile in their \"Profile Results\" that are shown once you search for a particular person? When Google shows incorrect info, there's a feedback option at the bottom and it gives you an option to report which info is wrong. However, this is a tedious task since I've already done this before and they have corrected it, but then when I search again after a couple of months, the wrong info is shown again. So maybe I need to correct or do something in the process of putting that info into the profile... Not sure where Google gets all this info... Help?"} {"_id": "8059", "title": "Where is a good place for non-technical people to register domain names?", "text": "I run a couple of websites and I have several domain names. I struggle to recommend a place for non-technical \"civilians\" to register a domain name though. The first service I ever used was GoDaddy, which seems to be focused on less technical customers, but the experience is just so terrible. The sales process goes all out to load up customers with a dozen things they don't need. These days I just use one of my current hosting providers. So as webmasters, where do you send your non-technical friends to register domain names? Is there a nice, simple, friendly service out there?"} {"_id": "8058", "title": "Registering .ne for non-residents", "text": "AFAIK it's hard to register one if you aren't a resident. Are there registrars that can pre-register .ne for me (like NetworkSolutions does for .eu)?"} {"_id": "34048", "title": "How to crawl a webpage with dynamic content added by JavaScript", "text": "I heard news that Google bots have the capability to understand our JavaScript code. That means it is possible to fully crawl a webpage which has lazy loading enabled.
I am using Apache Nutch to crawl websites, but I don't think it has the capability to fetch the URLs injected into the HTML page by JavaScript when the page is scrolled down. I see a lot of websites doing lazy loading for performance reasons. So can somebody please explain how I can crawl the data which comes into the HTML page on lazy load (on scrolling the page down)?"} {"_id": "60450", "title": "Images on my site contain wp-content/uploads/ in the URL, how do I change this?", "text": "Whenever you click on an image on my site it displays www.sitename.com/wp-content/uploads/date/image.jpg. How do I change this to something like www.sitename.com/image.jpg or www.sitename.com/post/image.jpg?"} {"_id": "48450", "title": "InstallShield in VS2012 and deployment of web application in IIS 7.5", "text": "I deployed a web application in IIS 7.5 using _InstallShield_. But after deploying the site I am facing the below error when accessing the site from the URL. HTTP Error 403.14 - Forbidden The Web server is configured to not list the contents of this directory. Possible causes: A default document is not configured for the requested URL, and directory browsing is not enabled on the server. I also specified the default document as one of the `.cshtml` files in my project and enabled directory browsing. But I am still facing the same error."} {"_id": "68023", "title": "Can we preserve our old domain's PageRank but cut all ties with the old domain such that it doesn't redirect?", "text": "Here's the situation - a client of mine has recently changed their company name and has changed their web address. They want to completely cut all ties with their old brand, so they have told me that their old website absolutely _cannot_ redirect to the new one. The old domain has a PR of 4, so it would be nice to carry some of that authority over to the new domain. The new domain has just been registered so I'm really starting over from scratch in terms of SEO.
I can't use 301 redirects and I can't link to the new site from the old one. Is there anything I can do?"} {"_id": "31493", "title": "Multiple 301 redirects and massive loss of ranking", "text": "I just remade a website from scratch for a client. The client asked me to preserve their ranking by making 301 redirects from the original URLs to the new URLs. For example, the following URL: `plumber-directory.my-website.com/john-smith-city-1.php` became `directory.my-website.com/plumber/city/john-smith.html` So I put the website online for a few days until the 301s partially kicked in on the Google results. Then the client called me back to tell me that his boss wanted to switch back to the old URLs >_< So I put a new 301 redirect in place: `directory.my-website.com/plumber/city/john-smith.html` reverted to `plumber-directory.my-website.com/john-smith-city-1.php` Because Google had just a few days to assimilate the new URLs, it now has both kinds of URLs in its result pages. Also, the ranking of the website keeps falling every day; I suspect Google is mistaking those redirects for duplicate content. Is there something I can do to avoid a total loss of rankings?"} {"_id": "7498", "title": "How do I prevent spam in a vBulletin forum?", "text": "I'm using vBulletin 4.0.8 and I'm receiving spam every day. I used the reCAPTCHA method to avoid spamming but the problem still persists. What should I do?"} {"_id": "2887", "title": "URI design for many-to-many relationship: advice needed", "text": "I'm stuck designing URIs for a many-to-many relationship between users and other resources. Specifically I'm having difficulty predicting the consequences of different approaches. Given resource types `user` and `widget` (`user` represents a registered user), authenticated (logged-in) users have write access to all widgets and guests have read access. The user-widget relation also has data (e.g.
number of edits) which ideally should also be readable (thus _addressable_) by guests (and, as L\u00e8se points out below, other users). Examples use integers for unique identifiers for the sake of clarity. Guests can view widget #12 at /widgets/12 What are the implications of various possible URIs for authenticated user #34 to read/write widget #12, and optionally for a guest to read the relational data between the two? (Don't worry about giving a comprehensive answer; any observations for or against specific options would also be very valuable.) /widgets/12 (user is implicit based on authentication, no way for a guest to specify the user-widget relation) /users/34/widgets/12 (both sides of the relation explicit, implicit hierarchy between resources) /users/34/widgets/12 and /widgets/12/users/34 (same, without hierarchy) /user-widget/34,12 (relation as resource, explicit but possibly more than the user needs to know) * * * I'm happy to give more detail or add other people's URI suggestions. Just comment if you want me to add anything."} {"_id": "55309", "title": "Rewrite rule for friendly URLs to index.php is not working from htaccess", "text": "Trying to get SEO-friendly URLs to work for my website using the following htaccess: RewriteBase / RewriteCond %{REQUEST_FILENAME} !-l RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^explore/(.*)$ index.php?page=explore&type=$1 RewriteRule ^(.*)$ index.php?page=$1 E.g. it should load `/index.php?page=test` for `/test/` and `/index.php?page=explore&type=1` for `/explore/1`. However, it's not working; does anyone know how I could get this to work?"} {"_id": "26661", "title": "Option to save project files for later use in Dreamweaver?", "text": "Does anyone know of an extension or other way to allow me to save a set of files in a project for later use? Example: - Working on site A, opened HTML files A1-A15 (15 files) * Received a request to work on site B, new files (number unimportant).
* I would like DW to remember that I was working on files A1-A15. * Close the site A files and focus on just the files from site B. * Complete the site B work. * Reopen the site A files altogether. Suggestions are greatly appreciated. Thanks!"} {"_id": "44970", "title": "Our \"Contact Us\" page is outperforming our \"Home\" page on Google", "text": "We're looking at how our website is performing on Google and noticing that our \"contact us\" page is outperforming our \"homepage\" for a typical search, which is product+geography. widgets dallas widget makers dallas widget shop texas The reason is that the \"contact us\" page has \"dallas\" & \"texas\" from our address more prominently in its text. So Google is being perfectly logical in showing this page as a priority. But the \"contact us\" page is far from the best page for our new customer to stumble on. It's not a _bad_ page, it's just not the best page for a new visitor to come to _first_. Our homepage is where to find our best stuff. Is there a correct solution to this problem? The obvious solution of stuffing more geographic keywords into our homepage feels a bit dirty (and anti-Google). Equally, making the \"contact us\" page more like our homepage is nonsensical. We don't want to detract from its job of providing \"contact us\" information to people who really need it, a job which it's currently doing fine."} {"_id": "31498", "title": "Why do I get this error about a file not being found?", "text": "I am getting this error: `File does not exist: /var/www/user/home/user/public_html/cgi-sys` I have only .html files in the document root. Can anyone tell me why I am getting this error?"} {"_id": "31499", "title": "Does URL encoding create duplicate content?", "text": "An SEO expert was testing my site, and noticed that my URLs contained the special character `:`. He said that would create duplicate content, because Google would interpret any URL containing `:` as two separate URLs: one with `:` and one with `%3A`.
Is he right?"} {"_id": "7496", "title": "Is there a search engine that indexes the source code of a web page?", "text": "I need to search the web for sites in our industry that use the same AdWords management company, to ensure that said company is not violating our contract, as they have been accused of doing. They use a tracking code in the template of every page which has a certain domain in the URL, and I'm wondering if it's possible to \"Google\" the source code using some bot that crawls the code rather than the content? For example, I bought an unlimited license for an image gallery, and I was asked to type the license number in a comment just before the script. I thought it was just so a human could look at the source and find out if someone had paid, but it turned out that they actually had a crawler looking for their source code and that comment. If it ran across the code on your site, it would look for the comment, and if it found one, it would check to see if it was an existing one. If not, it would first notify you of your noncompliance, and then notify the owner of the script. **Edit:** I'm looking to index HTML and JavaScript only, not the server-side languages or Java."} {"_id": "50320", "title": "Does content with few words and lots of anchor text count as spam?", "text": "My website has been getting inbound links from an article which contains a low word count but lots of anchor text. That paragraph contains 150 words but nearly 15 are anchor text. Will that affect my site's ranking?"} {"_id": "44976", "title": "How necessary is it to use HTML5 now on websites?", "text": "I have been working on a WordPress theme and am almost done with it, but recently I got to learn some of the new HTML5 tags such as `
    `, `