With rows of employees tapping away quietly at computers, this could be a call centre or tax office.
Apart from a discreet, hand-painted sign on the wall, there's little to suggest that it's the home of the world's most popular social networking site.
But here, on the Dublin waterfront, next to U2's recording studio, is Facebook's European headquarters.
Far from working on boring spreadsheets, staff are ploughing through reports of pornography, hate speech and bullying.
If one of Facebook's 300 million users feels the need to report abuse, there's a good chance it will pop up on a screen here.
With Facebook's many features, there are a number of different ways bullying can take place on the site.
"Groups are the most public thing we see, with individuals making groups about each other in a manner meant to be abusive," says Marina, who leads the user operations team.
"Some other times [it's] inbox messages. People sending not nice messages to each other. Then again, there's wall posts and comments about others on a mutual friend's wall or in a status update.
"We do also see examples with photos. It can be as explicit as photos that have been photoshopped in an abusive manner or something as simple as captions, tags or comments."
The user operations team isn't bound by hard-and-fast rules on removing content; rather, decisions come down to common sense, explains Marina.
They look at the context around reported abuse. A group may appear to attack an individual, but in a surprising number of cases it turns out to have been set up by that person themselves, as a joke.
Dealing with online bullying is a contentious issue and all the social networks have, at some time, been criticised for not doing enough.
Facebook's user operations team consists of a couple of hundred staff around the world dealing with hundreds of millions of users.
However, they are aided by smart, automated systems, designed to root out online troublemakers.
"We can track, for example, if people are inviting large numbers in a certain demographic, that might flag up," says Colm Long, Facebook's director of online operations in Dublin.
The system looks for things like men befriending only females or teenage girls.
Critics have suggested that sites such as Facebook should go further and use keyword scanning to block all abusive content before it can be posted.
However, the international nature of social networks means that may not be possible, according to Colm: "We are translated in over 60 different languages across the world.
"Our technology is set up to look for behaviours that we see across those different groups and across those different languages."
And however sophisticated the automated system is now, or becomes in the future, Facebook maintains that its key weapon in the battle against bullying will always be individual users speaking up.
"Facebook can't or shouldn't be part of every conversation that's happening on the platform," continues Colm. "The most important component is the community itself."