How I Added CSV Importing In My React-Node.js Project

Let’s face it. Touch Base was fine. It was a cool project. It worked. But… was it really usable? (If you don’t know what I’m talking about, TB is a full stack React contacts management app).

I was thinking about this and realized something obvious. When a user starts using Touch Base they have to add contacts manually. Which might be fine if you have 5 contacts. If you have 1,000 contacts you want to add, this sucks… and you probably won’t want to use this system. So of course, I knew I had to add the ability to import contacts.

Researching Options

My first Google search was “csv importers”, or something like that. I looked through some of the options available and found FlatFile. Their main heading reads “The fastest way to collect, onboard and migrate data.” Perfect… except it wasn’t all that for me. Now, this is probably my fault (they seem like an amazing service), but the process of implementing their importer was taking more effort than I was willing to put in. This is the perfect time for a little sidebar context:

Lately, I’ve been really valuing scrappiness. I want to get things done, fast. This isn’t about cutting corners, I just don’t want to have any excuses or unnecessary delays. After all, I’m just one guy working on side projects. So my current attitude is fail, learn, and iterate fast. All while doing good work.

Back to FlatFile. As much as I wanted to use their promising software, I asked myself if I really needed all their bells and whistles and if fighting their docs was worth it. Definitely not. So I went back to my search and landed on Papa Parse, which I recalled seeing earlier. Their main heading reads “The powerful, in-browser CSV parser for big boys and girls.” 😆 I was in.

Implementation

First things first, I added a POST route:

app.post("/app/import-contacts", verifyToken, upload.single("file"), async (req, res) => {
     ...
}

‘verifyToken’ is a middleware function that does exactly what its name says: it verifies the user’s ID token. ‘upload.single("file")’ is a multer call that I have set up to upload incoming files to my S3 bucket.
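
If you haven’t wired those up before, here’s a rough sketch of what that setup can look like. This isn’t my exact code: it assumes the AWS SDK v2, the multer-s3 storage engine, and a Firebase ID token, so swap in whatever stack you’re actually using.

const multer = require("multer");
const multerS3 = require("multer-s3");
const AWS = require("aws-sdk");
const admin = require("firebase-admin"); // assumes admin.initializeApp() was called elsewhere

const s3 = new AWS.S3();

// Hypothetical multer setup: stream uploads straight to S3 instead of the server's disk
const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: "my-example-bucket",
    key: (req, file, cb) => {
      // prefix with a timestamp so repeat uploads don't overwrite each other
      cb(null, `imports/${Date.now()}-${file.originalname}`);
    },
  }),
});

// Hypothetical auth middleware: reject the request if the ID token is missing or invalid
const verifyToken = async (req, res, next) => {
  try {
    const idToken = req.headers.authorization?.split("Bearer ")[1];
    req.user = await admin.auth().verifyIdToken(idToken);
    next();
  } catch (err) {
    res.status(401).json({ error: "Unauthorized" });
  }
};

With a storage engine like multer-s3, the uploaded file never touches the server’s disk, and req.file.key ends up holding the object’s key in the bucket, which is exactly what the route needs next.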

Inside the route, I grab the user id and file through destructuring:

const { uid } = req.user;
const { file } = req;

As the npm package docs for Papa Parse state, “Papa Parse can parse a Readable Stream instead of a File when used in Node.js environments (in addition to plain strings).”

So I prepared to stream the file directly to Papa Parse by creating a read stream from my S3 bucket, along with an empty array to hold the parsed results. Can’t forget about handling potential errors.

const s3Stream = s3.getObject({
    Bucket: 'my-example-bucket',
    Key: file.key
}).createReadStream();

let parsedData = [];

Then I finally pass the stream to Papa Parse, set my config options and handle any errors coming from the results.

Papa.parse(s3Stream, {
    header: true,
    dynamicTyping: true,
    complete: async (results) => {
        if (results.errors.length > 0) {
            console.error('Error parsing CSV:', results.errors);
            return res.status(500).json({ error: "Error parsing CSV" });
        }

        parsedData = results.data;
        ...

In the code above, ‘complete’ is a Papa Parse config option that takes a callback function, which runs once parsing is finished. I then grab the data provided by ‘results’ and hold onto it as ‘parsedData’.
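
One gap in that snippet is what happens if the S3 read itself fails before Papa Parse gets anything to chew on. This isn’t in my route above, but a minimal way to cover it, using Node’s standard stream error event, would be something like:

// Hypothetical guard: if S3 can't serve the object, bail out with a 500
s3Stream.on('error', (err) => {
    console.error('Error reading file from S3:', err);
    if (!res.headersSent) {
        res.status(500).json({ error: "Error reading uploaded file" });
    }
});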

After this, it’s time to run some queries on the database and process the contacts. But first, I need to grab a connection to the db to run the queries on.

const client = await pool.connect();
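
For context, ‘pool’ is my database connection pool, set up elsewhere in the project. The setup isn’t shown here, but with node-postgres it looks roughly like this (the connection string below is a placeholder):

// Hypothetical pool config; in a real app this lives in its own module
const { Pool } = require("pg");

const pool = new Pool({ connectionString: process.env.DATABASE_URL });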

This next part is a lot of code inside a try/catch, so I’ll just give it to you straight with some comments on it.

try {
  await client.query("BEGIN");

  // fetch existing contacts
  const existingContacts = await client.query(
    "SELECT email FROM contacts WHERE user_id = $1",
    [uid]
  );

  // extract emails into a new set for quick lookup
  const existingEmails = new Set(
    existingContacts.rows.map((contact) => contact.email)
  );

  // filter out duplicates before inserting
  const filteredContacts = parsedData.filter((contact) => {
    if (existingEmails.has(contact.email)) {
      console.log(`Duplicate contact found: ${contact.email}`);
      return false;
    }
    return true;
  });

  // map filtered contacts to correct format
  const contactsToInsert = filteredContacts.map((contact) => [
    uid,
    contact.first_name,
    contact.last_name,
    contact.email,
    contact.phone,
    contact.address1,
    contact.address2,
    contact.city,
    contact.state,
    contact.zip,
    contact.categories,
    contact.photo_url,
    contact.photo_filename,
    contact.photo_mimetype,
    contact.photo_upload_time,
    contact.notes,
  ]);

  // don't insert if no new contacts
  if (contactsToInsert.length === 0) {
    await client.query("ROLLBACK");
    return res.status(400).json({ error: "No new contacts to import" });
  }

  // bulk insert the non-duplicate contacts in a single query
  const query = format(
    "INSERT INTO contacts (user_id, first_name, last_name, email, phone, address1, address2, city, state, zip, categories, photo_url, photo_filename, photo_mimetype, photo_upload_time, notes) VALUES %L",
    contactsToInsert
  );
  await client.query(query);

  await client.query("COMMIT");
  res.status(201).json({ message: "Contacts imported successfully" });
} catch (err) {
  await client.query("ROLLBACK");
  console.error("Database error:", err);
  res.status(500).json({ error: "Database error" });
} finally {
  client.release();
}

As you can see, I:

  • fetch existing contacts

  • filter out duplicate contacts by email, since every contact’s email must be unique

  • bulk insert the non-duplicate contacts into the table
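
For anyone wondering about the format call in that bulk insert: that’s the job of a helper like the pg-format package, whose %L placeholder expands an array of rows into one safely escaped VALUES list. Roughly, with some made-up rows trimmed down to a few columns:

const format = require("pg-format");

// made-up example rows, just to show the shape
const rows = [
  ["uid-123", "Ada", "Lovelace", "ada@example.com"],
  ["uid-123", "Alan", "Turing", "alan@example.com"],
];

const sql = format(
  "INSERT INTO contacts (user_id, first_name, last_name, email) VALUES %L",
  rows
);

// sql is now a single statement along the lines of:
// INSERT INTO contacts (user_id, first_name, last_name, email)
//   VALUES ('uid-123', 'Ada', 'Lovelace', 'ada@example.com'), ('uid-123', 'Alan', 'Turing', 'alan@example.com')

That single statement is what lets the route insert every new contact with one query instead of looping.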

Frontend

The frontend is largely specific to how I built the app, but let’s connect the dots here.

The Import Contacts page does one thing, so it’s very simple. I use the native file upload button, which is really just an input.

<input className="import-contacts-input" type="file" accept=".csv" onChange={handleFileUpload} />

When the input detects a change I trigger a handleFileUpload function.

Inside the handleFileUpload function, I grab the selected file and set the loading state to true so that I can display my little loading spinner to the user while this process takes place.

const handleFileUpload = (e) => {
    // grab the selected file from the input
    const file = e.target.files[0];
    setLoading(true);

When a user successfully selects their .csv file, I append it to a new FormData object and send it to the backend route above to process it. You can also see the error handling I have set up…

if (file) {
      const formData = new FormData();
      formData.append("file", file);

      fetch(`https://${backendURL}/app/import-contacts`, {
        method: 'POST',
        body: formData,
        headers: {
          Authorization: `Bearer ${idToken}`
        },
      })
      .then(response => response.json())
      .then(data => {
        if (data.error) {
          setLoading(false);
          if (data.error === 'No new contacts to import') {
            setToastAlert({
              visible: true,
              message: 'Only duplicate contacts found. No new contacts to import.',
              type: 'info'
            });
          } else {
            // any other backend error (parse failure, db error) still needs a toast
            setToastAlert({
              visible: true,
              message: 'Error importing contacts. Please refresh the page and try again.',
              type: 'error'
            });
          }
          return;
        }

        setLoading(false);
        setToastAlert({
          visible: true,
          message: 'Contacts imported successfully!',
          type: 'success'
        });
      })
      .catch((error) => {
        console.error('Error:', error);
        setLoading(false);
        setToastAlert({
          visible: true,
          message: 'Error importing contacts. Please refresh the page and try again.',
          type: 'error'
        });
      });
    } else {
      console.log('No file selected');
      setLoading(false);
      setToastAlert({
        visible: true,
        message: 'No file selected. Please select a file and try again.',
        type: 'error'
      });
    }
  };

Whether the file is handled successfully or something errors out, the loading state gets set back to false, and I trigger an appropriate toast alert to let the user know exactly what happened in a nice way.

It feels so nice to log in, upload a .csv file of contacts, get a success toast alert, and then see all of the new contacts populated in your account. And it’s so quick; you might see the loading spinner for just a second. The bulk insert query also helps a lot there.

From Maybe Usable to Usable

Before adding this feature I wondered how usable the app truly was. Now, there’s no question. Although it wasn’t super complex, it’s a feature you would expect to see in this type of application, so I considered it a requirement to implement. I think it makes the project a little more serious. Aside from that, I’d never done anything with .csv files before, which made this super fun to work on. Papa Parse integrated so well with the tools I was already using, which made it super easy. I definitely recommend it.

If you made it this far, cheers to you for reading this 🥂…

and cheers to software that doesn’t suck 🥂

P.S. I’m still wondering if my project sucks 😂

If you want to check it out here’s the link again - Touch Base. Til next time!