javascript - wait for one fetch to finish before starting the next - Stack Overflow


I have a list of data that I am sending to google cloud. My current code looks like this:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(team => {
    fetch(URL, {
      method: 'PUT',
      body: team
    });
})

This works with one team, but it times out when sending multiple, larger files. I am sending images, not strings. To solve this I need to send the files one at a time, waiting for the previous request to complete before starting the next one. Can anyone advise the best way of doing this?

Worth noting that I don't have any control over the number of files that are uploaded.


edited Mar 17, 2022 at 8:29 by VLAZ; asked Oct 21, 2019 at 18:59 by peter flanagan
  • If you have access to npm, you could install bluebird and use Promise.reduce; Promise.reduce will execute an array of promises one by one as they resolve, and allows you to "reduce" the previous results into one final result. – gabriel.hayes, Oct 21, 2019 at 19:04
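The same idea works with plain promises, no Bluebird required. A minimal sketch (with a hypothetical fakeFetch standing in for the real upload call) chains the requests sequentially with Array.prototype.reduce and collects each result:

```javascript
// Sketch: chain requests one after another with Array.prototype.reduce,
// collecting results as we go -- similar in spirit to Bluebird's Promise.reduce.
// fakeFetch is a hypothetical stand-in for the real upload request.
const fakeFetch = (team) => Promise.resolve(`uploaded ${team}`);

const teams = ['LFC', 'MUFC', 'CFC'];

const resultsPromise = teams.reduce(
  (acc, team) =>
    acc.then((results) =>
      fakeFetch(team).then((res) => [...results, res])
    ),
  Promise.resolve([]) // start the chain with an already-resolved promise
);

resultsPromise.then((results) => console.log(results));
// ['uploaded LFC', 'uploaded MUFC', 'uploaded CFC']
```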

3 Answers


Use reduce instead of forEach, chaining with .then().

The following stores the promise of the last fetch in acc (the accumulator parameter of reduce), and starts the next fetch inside a then callback, which ensures the previous fetch has finished:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.reduce((acc, team) => {
  return acc.then(() => {
    return fetch(URL, {
      method: 'PUT',
      body: team
    });
  })
}, Promise.resolve())
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err))

To run this as a self-contained demo, fetch can be simulated:

//Simulate fetch: resolve after one second and log the uploaded body
const fetch = (url, { body }) => new Promise(resolve => setTimeout(() => { resolve(); console.log(body) }, 1000))

You can even write a general helper function for it:

const teams = ['LFC', 'MUFC', 'CFC'];

const promiseSeries = (arr, cb) => arr.reduce((acc, elem) => acc.then(() => cb(elem)), Promise.resolve())

promiseSeries(teams, (team) => {
  return fetch(URL, {
    method: 'PUT',
    body: team
  })
})
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err))

Or, better yet, use async/await (ES2017), which is more readable:

const teams = ['LFC', 'MUFC', 'CFC'];

async function upload(teams) {
  for (const team of teams) {
    await fetch(URL, {
      method: 'PUT',
      body: team
    });
  }
}

upload(teams)
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err))

You can use async/await with a for...of loop. Each call "holds" the loop until it finishes, and then the loop continues with the next call:

const teams = ['LFC', 'MUFC', 'CFC'];

async function send(teams) {
  for (const team of teams) {
    await fetch(URL, {
      method: 'PUT',
      body: team
    });
  }
}
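A minimal sketch, assuming a hypothetical fakeFetch that resolves after a short delay, shows the uploads completing strictly in order:

```javascript
// Sketch: fakeFetch is a hypothetical stand-in for the real upload call;
// it resolves after a short delay and records the completion order.
const completed = [];
const fakeFetch = (team) =>
  new Promise((resolve) => setTimeout(() => { completed.push(team); resolve(); }, 10));

async function send(teams) {
  for (const team of teams) {
    await fakeFetch(team); // the loop pauses here until this upload finishes
  }
}

const done = send(['LFC', 'MUFC', 'CFC'])
  .then(() => console.log(completed)); // ['LFC', 'MUFC', 'CFC'], in array order
```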

You can make use of async/await, as follows:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(async (team) => {
    await fetch(URL, {
      method: 'PUT',
      body: team
    });
})

Note, however, that forEach does not wait for its async callback: each callback returns a promise that forEach ignores, so all requests are still started in parallel. The await only pauses inside each individual callback, which does not solve the original problem.
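A sketch (with a hypothetical fakeFetch standing in for the real call) shows why this still runs in parallel: forEach fires every callback immediately and ignores the promises they return:

```javascript
// Sketch: forEach fires every async callback immediately and discards the
// returned promises, so all uploads start before any of them finishes.
// fakeFetch is a hypothetical stand-in that resolves after a delay.
const order = [];
const fakeFetch = (team) =>
  new Promise((resolve) => setTimeout(() => { order.push(`done ${team}`); resolve(); }, 50));

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(async (team) => {
  order.push(`start ${team}`);
  await fakeFetch(team); // pauses this callback only, not the forEach loop
});

// Synchronously after the forEach, every upload has started but none has finished:
console.log(order); // ['start LFC', 'start MUFC', 'start CFC']
```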
